Column       dtype          min     max
Unnamed: 0   int64          0       14.2k
ReviewID     int64          2.14M   32.7M
Target       stringlengths  12      3.68k
Background   stringlengths  46      5.35k
Abstract     stringlengths  263     730k
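The schema above reports, per text column, the minimum and maximum string lengths across the dump. A minimal sketch of recomputing that summary from rows shaped like the records below (the row values here are invented; the dump's source file is not named):

```python
# Hypothetical rows mirroring the schema: a ReviewID plus three
# free-text fields (Target, Background, Abstract), some of them null.
rows = [
    {"ReviewID": 30119907, "Target": "short target", "Background": "bg", "Abstract": "abs"},
    {"ReviewID": 30188385, "Target": "another", "Background": None, "Abstract": None},
]

def string_lengths(rows, col):
    """Return (min, max) length of the non-null values in one text column."""
    lengths = [len(r[col]) for r in rows if r[col] is not None]
    return (min(lengths), max(lengths))

summary = {c: string_lengths(rows, c) for c in ("Target", "Background", "Abstract")}
print(summary)
```

On the real file the same loop would reproduce the `stringlengths` row of the schema; nulls (as in the records below) are simply skipped.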
11,000
30,119,907
If the two conditions are linked, once an individual achieves better control of asthma symptoms, might the excruciating migraine ease, too?
OBJECTIVES We conducted this review to systematically assess the association and risk of migraine in patients with asthma, and vice versa.
The Global Asthma Network (GAN), established in 2012, followed the International Study of Asthma and Allergies in Childhood (ISAAC). ISAAC Phase One involved over 700,000 adolescents and children from 156 centres in 56 countries; it found marked worldwide variation in symptom prevalence of asthma, rhinitis and eczema that was not explained by the current understanding of these diseases. ISAAC Phase Three involved over 1,187,496 adolescents and children (237 centres in 98 countries). It found that asthma symptom prevalence was increasing in many locations, especially in low- and middle-income countries where severity was also high, and identified several environmental factors that required further investigation. GAN Phase I, described in this article, builds on the ISAAC findings by collecting further information on asthma, rhinitis and eczema prevalence, severity, diagnoses, asthma emergency room visits, hospital admissions, management and use of asthma essential medicines. The subjects will be the same age groups as ISAAC, and their parents. In this first global monitoring of asthma in children and adults since 2003, further evidence will be obtained to understand asthma, management practices and risk factors, leading to further recognition of asthma as an important non-communicable disease and to reduction of its global burden. The Global Asthma Network undertakes global studies of asthma surveillance, management and risk factors.

The aim of this study was to assess the role of depression as a predictor of new onset of chronic migraine (CM) among persons with episodic migraine (EM). The American Migraine Prevalence and Prevention (AMPP) study followed 24,000 persons with severe headache identified in 2004. Using random-effects logistic regression, we modeled the probability that persons with EM in 2005 or 2006 would develop CM in the subsequent year. Depression was assessed in two ways, using a validated questionnaire (PHQ-9 score ≥15) and based on self-reported medical diagnosis. Analyses were adjusted for multiple covariates including sociodemographics, body mass index, headache pain intensity, headache frequency, migraine symptom severity, cutaneous allodynia, acute medication overuse, antidepressant use and anxiety. Of 6,657 participants with EM in 2005, 160 (2.4%) developed CM in 2006. Of 6,852 participants with EM in 2006, 144 (2.2%) developed CM in 2007. In fully adjusted models, PHQ-9-defined depression was a significant predictor of CM onset [odds ratio (OR) = 1.65, 95% CI 1.12–2.45]. There was a depression-dose effect; relative to participants with no depression or mild depression, those with moderate (OR = 1.77, 95% CI 1.25–2.52), moderately severe (OR = 2.35, 95% CI 1.53–3.62), and severe depression (OR = 2.53, 95% CI 1.52–4.21) were at increased risk for the onset of CM. Among persons with EM, depression was associated with an increased risk of CM after adjusting for sociodemographic variables and headache characteristics. Depression preceded the onset of CM, and risk increased with depression severity, suggesting a potentially causal role, though reverse causality cannot be excluded.

BACKGROUND Data for trends in prevalence of asthma, allergic rhinoconjunctivitis, and eczema over time are scarce. We repeated the International Study of Asthma and Allergies in Childhood (ISAAC) at least 5 years after Phase One, to examine changes in the prevalence of symptoms of these disorders. METHODS For the ISAAC Phase Three study, between 2002 and 2003, we did a cross-sectional questionnaire survey of 193,404 children aged 6-7 years from 66 centres in 37 countries, and 304,679 children aged 13-14 years from 106 centres in 56 countries, chosen from a random sample of schools in a defined geographical area. FINDINGS Phase Three was completed a mean of 7 years after Phase One. Most centres showed a change in prevalence of 1 or more standard errors (SE) for at least one disorder, with increases being twice as common as decreases, increases being more common in the 6-7 year age-group than in the 13-14 year age-group, and at most levels of mean prevalence. An exception was asthma symptoms in the older age-group, in which decreases were more common at high prevalence. For both age-groups, more centres showed increases in all three disorders than showed decreases, but most centres had mixed changes. INTERPRETATION The rise in prevalence of symptoms in many centres is concerning, but the absence of increases in prevalence of asthma symptoms for centres with existing high prevalence in the older age-group is reassuring. The divergent trends in prevalence of symptoms of allergic diseases form the basis for further research into the causes of such disorders.

This cross-sectional clinical study was conducted to explore the relationship between atopic disorders and migraine. We evaluated 186 consecutive patients with migraine. Patients with a history of atopic disorders were compared with the others during headache-free intervals for their headache characteristics, pulmonary function test (PFT) performances and immunological screenings, using appropriate statistical methods. Of the patients with migraine, 77 (41.4%) reported at least one atopic disorder. PFT screening showed a generally decreased pulmonary capacity and an important correlation between a positive history of atopic disorders and both increased eosinophil and IgE levels in headache-free periods. It remains to be discussed whether screening with PFT or immunological tests helps in early detection of progressive lung disease that might develop in these patients.

OBJECTIVE Clinical observation of a decrease in migraine frequency in patients with comorbid asthma taking montelukast, a specific D4 leukotriene receptor antagonist, or zafirlukast, another leukotriene receptor antagonist, prompted us to explore a possible role for leukotriene modifiers in the treatment of migraine. (A further prompt was a pharmacist colleague's observation that a number of patients on these agents reported a decreased sensitivity to perfume triggers and improvement in migraine.) BACKGROUND Nonsteroidal anti-inflammatory agents have been used widely in the treatment of migraine. Another class of anti-inflammatory agents, known as leukotriene modifiers, has not been studied to date with regard to a possible role in the treatment of migraine. The name "leukotriene" is derived both from the parent molecule, which was originally isolated from leukocytes, and from its three-double-bond carbon backbone, or triene, structure. Both prostaglandins and leukotrienes are derived from the metabolism of arachidonic acid, with prostaglandins coming off the cyclooxygenase pathway and leukotrienes derived via the enzyme 5-lipoxygenase. Both prostaglandins and leukotrienes mediate inflammatory responses. The latter have been studied with regard to their role in the pathophysiology of asthma. METHODS A prospective, open-label study evaluating the efficacy of montelukast, 10 mg or 20 mg, in the prophylaxis of migraine in 17 patients is presented in this paper. All 17 patients completed the study, which consisted of a 2-month baseline run-in period and a 3-month treatment phase. RESULTS Montelukast was extremely well tolerated, and no adverse events were reported by any of the patients. Fifty-three percent showed a reduction of greater than 50% (P<.025) in the frequency of severe attacks, with 41% showing a reduction of greater than 60%. Responders, including modest responders, rated the drug as excellent. CONCLUSIONS We conclude, given the limitations of an open-label study design and the small sample size, that montelukast shows potential as an effective, well-tolerated prophylactic agent in migraine. Double-blinded, placebo-controlled studies are warranted. In addition, the leukotrienes, as suggested previously in the literature, may play a role in the pathogenesis of migraine.
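The odds ratios reported in the depression-and-chronic-migraine abstract above come from adjusted logistic regression models. As a rough illustration of the underlying quantity only, the sketch below computes an unadjusted odds ratio with a Woolf (log-normal) 95% confidence interval from a 2x2 table; the counts are invented, not the AMPP data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with Woolf 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: depressed EM patients with/without CM onset
# vs non-depressed EM patients with/without CM onset.
print(odds_ratio_ci(40, 960, 120, 5537))
```

An adjusted OR, as reported in the study, would instead come from exponentiating a coefficient of a multivariable logistic regression; the 2x2 version shown here is the unadjusted special case.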
11,001
30,188,385
Summary There is robust evidence that CBT-E is an effective treatment for patients with an eating disorder.
null
null
11,002
18,796,060
Parenting interventions, most commonly provided within the home using multi-faceted approaches, appear to be effective in reducing unintentional child injury.
null
null
11,003
32,062,764
The majority focused on behavioural and socioemotional problems or suicide risk, examined universal screening models, and used cross-sectional designs. In general, school-based programmes for identifying MHD aligned with schools' priorities, but their appropriateness for students varied by condition. Time, resource, and cost concerns were the most common barriers to feasibility across models and conditions.
Under-identification of mental health difficulties (MHD) in children and young people contributes to the significant unmet need for mental health care. School-based programmes have the potential to improve identification rates. This systematic review aimed to determine the feasibility of various models of school-based identification of MHD.
This study employed a self-report questionnaire in a Solomon four-group design to assess the efficacy of suicide intervention classes in achieving their instructional objectives. Because adolescents are often the first to know of a peer's suicidal thoughts or plans, the goal of the classes was to increase the likelihood that students who come into contact with potentially suicidal peers can more readily identify them and will be consistently inclined to take responsible action on their behalf. Students who participated in the classes, as compared to controls, showed significant gains in relevant knowledge about suicidal peers and significantly more positive attitudes toward help seeking and intervening with troubled peers. Results of this study will be used to strengthen components of the lessons aimed at enhancing the likelihood of performance of responsible interventions.

Background Implementation outcome measures are essential for monitoring and evaluating the success of implementation efforts. Yet currently available measures lack conceptual clarity and have largely unknown reliability and validity. This study developed and psychometrically assessed three new measures: the Acceptability of Intervention Measure (AIM), Intervention Appropriateness Measure (IAM), and Feasibility of Intervention Measure (FIM). Methods Thirty-six implementation scientists and 27 mental health professionals assigned 31 items to the constructs and rated their confidence in their assignments. The Wilcoxon one-sample signed rank test was used to assess substantive and discriminant content validity. Exploratory and confirmatory factor analysis (EFA and CFA) and Cronbach alphas were used to assess the validity of the conceptual model. Three hundred twenty-six mental health counselors read one of six randomly assigned vignettes depicting a therapist contemplating adopting an evidence-based practice (EBP). Participants used 15 items to rate the therapist's perceptions of the acceptability, appropriateness, and feasibility of adopting the EBP. CFA and Cronbach alphas were used to refine the scales, assess structural validity, and assess reliability. Analysis of variance (ANOVA) was used to assess known-groups validity. Finally, half of the counselors were randomly assigned to receive the same vignette and the other half the opposite vignette; all were asked to re-rate acceptability, appropriateness, and feasibility. Pearson correlation coefficients were used to assess test-retest reliability and linear regression to assess sensitivity to change. Results All but five items exhibited substantive and discriminant content validity. A trimmed CFA with five items per construct exhibited acceptable model fit (CFI = 0.98, RMSEA = 0.08) and high factor loadings (0.79 to 0.94). The alphas for the 5-item scales were between 0.87 and 0.89. Scale refinement based on measure-specific CFAs and Cronbach alphas using vignette data produced 4-item scales (α's from 0.85 to 0.91). A three-factor CFA exhibited acceptable fit (CFI = 0.96, RMSEA = 0.08) and high factor loadings (0.75 to 0.89), indicating structural validity. ANOVA showed significant main effects, indicating known-groups validity. Test-retest reliability coefficients ranged from 0.73 to 0.88. Regression analysis indicated each measure was sensitive to change in both directions. Conclusions The AIM, IAM, and FIM demonstrate promising psychometric properties. Predictive validity assessment is planned.

CONTEXT Universal screening for mental health problems and suicide risk is at the forefront of the national agenda for youth suicide prevention, yet no study has directly addressed the potential harm of suicide screening. OBJECTIVE To examine whether asking about suicidal ideation or behavior during a screening program creates distress or increases suicidal ideation among high school students generally, or among high-risk students reporting depressive symptoms, substance use problems, or suicide attempts. DESIGN, SETTING, AND PARTICIPANTS A randomized controlled study conducted within the context of a 2-day screening strategy. Participants were 2342 students in 6 high schools in New York State in 2002-2004. Classes were randomized to an experimental group (n = 1172), which received the first survey with suicide questions, or to a control group (n = 1170), which did not receive suicide questions. MAIN OUTCOME MEASURES Distress measured at the end of the first survey and at the beginning of the second survey 2 days later, using the Profile of Mood States adolescent version (POMS-A) instrument; suicidal ideation assessed in the second survey. RESULTS Experimental and control groups did not differ on distress levels immediately after the first survey (mean [SD] POMS-A score, 5.5 [9.7] in the experimental group and 5.1 [10.0] in the control group; P = .66) or 2 days later (mean [SD] POMS-A score, 4.3 [9.0] in the experimental group and 3.9 [9.4] in the control group; P = .41), nor did rates of depressive feelings differ (13.3% and 11.0%, respectively; P = .19). Students exposed to suicide questions were no more likely to report suicidal ideation after the survey than unexposed students (4.7% and 3.9%, respectively; P = .49). High-risk students (defined as those with depression symptoms, substance use problems, or any previous suicide attempt) in the experimental group were neither more suicidal nor more distressed than high-risk youth in the control group; on the contrary, depressed students and previous suicide attempters in the experimental group appeared less distressed (P = .01) and less suicidal (P = .02), respectively, than high-risk control students. CONCLUSIONS No evidence of iatrogenic effects of suicide screening emerged. Screening in high schools is a safe component of youth suicide prevention efforts.

High school principals' acceptability ratings of three school-based programs for the prevention of adolescent suicide were examined. From a random sample of members from the 1994-1995 membership directory of the National Association of Secondary School Principals (NASSP), a total of 185 (40%) respondents completed the Suicide Prevention Program Rating Profile (SPPRP), a measure designed to evaluate the acceptability of suicide prevention programs, after reading a description of a particular prevention program. Programs evaluated for their acceptability included (1) curriculum-based programs presented to students, (2) in-service presentations to school staff, and (3) student self-report screening measures. The results indicated that the curriculum-based and staff in-service programs were significantly more acceptable to principals than was the schoolwide student screening program. No significant differences between the acceptability of curriculum-based and in-service programs were found. Limitations of the study and implications for practice and research are discussed.

School-based depression screening and education programs are recommended for addressing the high rates of children's mental illness. The objectives of this study were to (1) identify Minnesota parent attitudes regarding the provision of school-based depression and suicide screening and education and (2) identify predictors of parent support for these school-based programs. A random sample of 1,300 Minnesota households with children ages 5-18 years was surveyed by mail. Chi-square tests and regression analyses were used to detect differences in parent support for depression and suicide screening and education across demographic categories and parent beliefs and knowledge about depression and suicide. The response rate of eligible households was 43% (N = 511). Overall, 84-89% of parents supported school-based depression and suicide screening and education. After adjusting for all variables, parent support for depression screening was associated with greater knowledge [OR 8.48, CI (1.30-55.21)] and fewer stigmatizing beliefs [OR 0.03, CI (0.01-0.12)]. Support for suicide screening was associated with fewer stigmatizing beliefs [OR 0.03, CI (0.01-0.10)]. Support for depression education was associated with fewer stigmatizing beliefs [OR 0.32, CI (0.10-1.00)] and lower educational attainment [OR 0.59, CI (0.40-0.89)]. Support for suicide education was associated with greater knowledge [OR 7.99, CI (1.02-62.68)], fewer stigmatizing beliefs [OR 0.26, CI (0.07-0.92)], and lower educational attainment [OR 0.60, CI (0.38-0.94)]. Parent support for school-based depression and suicide screening and education was high. Parent education to decrease stigmatizing beliefs and increase knowledge about depression and suicide may increase support among the minority of parents who do not endorse such programs.

From a random sample of members of the 2000-2001 membership directory of the American Association of School Administrators (AASA), public school administrators' acceptability ratings of three school-based programs for the prevention of adolescent suicide were examined. A total of 210 (46%) respondents examined a description of a suicide prevention program and completed a measure designed to evaluate the acceptability of suicide prevention programs. Three suicide prevention programs were evaluated for their acceptability: (a) school-wide curriculum-based programs presented to students; (b) in-service presentations to school staff; and (c) self-report screening programs for students. The results indicated that superintendents rated the staff in-service training and curriculum-based programs as significantly more acceptable than the school-wide screening program. In addition, the school-wide screening program was rated as significantly more intrusive by school psychologists than the staff in-service training or curriculum-based prevention programs. Limitations of the study and future research directions are discussed.

BACKGROUND The Cochrane Collaboration is strongly encouraging the use of a newly developed tool, the Cochrane Collaboration Risk of Bias Tool (CCRBT), for all review groups. However, the psychometric properties of this tool have yet to be described. Thus, the objective of this study was to add information about the psychometric properties of the CCRBT, including inter-rater reliability and concurrent validity, in comparison with the Effective Public Health Practice Project Quality Assessment Tool (EPHPP). METHODS Both tools were used to assess the methodological quality of 20 randomized controlled trials included in our systematic review of the effectiveness of knowledge translation interventions to improve the management of cancer pain. Each study assessment was completed independently by two reviewers using each tool. We analysed the inter-rater reliability of each tool's individual domains, as well as the final grade assigned to each study. RESULTS The EPHPP had fair inter-rater agreement for individual domains and excellent agreement for the final grade. In contrast, the CCRBT had slight inter-rater agreement for individual domains and fair inter-rater agreement for the final grade. Of interest, no agreement between the two tools was evident in the final grade they assigned to each study. Although both tools were developed to assess 'quality of the evidence', they appear to measure different constructs. CONCLUSIONS Both tools performed quite differently when evaluating the risk of bias or methodological quality of studies of knowledge translation interventions for cancer pain. The newly introduced CCRBT assigned these studies a higher risk of bias. Its psychometric properties need to be more thoroughly validated, in a range of research fields, to understand fully how to interpret results from its application.

Attendance and grade point average (GPA) data are universally maintained in school records and can potentially aid in identifying students with concealed behavioral problems, such as substance use. Researchers evaluated attendance (truancy) and GPA as a means to identify high school students at risk for substance use, suicide behaviors, and delinquency in 10 high schools in San Antonio, Texas, and San Francisco, California, during the spring and fall of 2002. A screening protocol identified students as “high risk” if (1) they were in the top quartile for absences and below the median GPA or (2) they were teacher-referred. Survey responses of 930 high-risk students were compared with those from a random sample of 393 “typical” students not meeting the protocol. Bivariate and multivariate analyses assessed associations between the screening protocol variables and demographics, risk and protective factors, and problem outcomes. The individual contribution of each of the variables was also assessed. Students identified as high risk were significantly more likely than typical students to use cigarettes, alcohol, and marijuana, to evidence suicide risk factors, and to engage in delinquent behavior. Norms varied between the two districts; nevertheless, high-risk students showed consistent differences in risk and protective factors, as well as problem behaviors, compared with typical students. Because of site differences in data collection and teacher participation, the comprehensive protocol is recommended, rather than individual indicators alone (e.g., truancy). Strengths of the screening protocol are the ready availability of school record data, the ease of use of the adapted protocol, and the option of including teacher referral. More research is recommended to test the generalizability of the protocol and to ensure that there are no unintended negative effects associated with identification of students as high risk.

BACKGROUND Although school-based programmes for the identification of children and young people (CYP) with mental health difficulties (MHD) have the potential to improve short- and long-term outcomes across a range of mental disorders, the evidence base on the effectiveness of these programmes is underdeveloped. In this systematic review, we sought to identify and synthesise evidence on the effectiveness and cost-effectiveness of school-based methods to identify students experiencing MHD, as measured by accurate identification, referral rates, and service uptake. METHOD The electronic bibliographic databases MEDLINE, Embase, PsycINFO, ERIC, British Education Index and ASSIA were searched. Comparative studies were included if they assessed the effectiveness or cost-effectiveness of strategies to identify students in formal education aged 3-18 years with MHD, presenting symptoms of mental ill health, or exposed to psychosocial risks that increase the likelihood of developing MHD. RESULTS We identified 27 studies describing 44 unique identification programmes. Only one study was a randomised controlled trial. Most studies evaluated the utility of universal screening programmes; where comparison of identification rates was made, the comparator test varied across studies. The heterogeneity of studies, the absence of randomised studies and poor outcome reporting make for a weak evidence base that generates only tentative conclusions about the effectiveness of school-based identification programmes. CONCLUSIONS Well-designed pragmatic trials that include the evaluation of cost-effectiveness and detailed process evaluations are necessary to establish the accuracy of different identification models, as well as their effectiveness in connecting students to appropriate support in real-world settings.
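Two psychometric quantities used in the abstracts above, internal consistency (Cronbach's alpha, as reported for the AIM/IAM/FIM scales) and inter-rater agreement (as assessed for the CCRBT and EPHPP domain ratings, commonly quantified with Cohen's kappa), can be sketched in a few lines. All scores and ratings below are invented for illustration:

```python
# 1) Cronbach's alpha: internal consistency of a k-item scale.
def cronbach_alpha(items):
    """items: one list of item scores per respondent."""
    k, n = len(items[0]), len(items)

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [var([row[j] for row in items]) for j in range(k)]
    total_var = var([sum(row) for row in items])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# 2) Cohen's kappa: chance-corrected agreement between two raters.
def cohens_kappa(r1, r2):
    """r1, r2: parallel lists of categorical judgments."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n      # observed agreement
    cats = set(r1) | set(r2)
    pe = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)  # chance
    return (po - pe) / (1 - pe)

# Hypothetical 4 respondents x 3 items (e.g. 5-point acceptability ratings)
scores = [[4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 3, 2]]
print(round(cronbach_alpha(scores), 3))

# Hypothetical "low"/"high" risk-of-bias judgments from two reviewers
rater1 = ["low", "low", "high", "low", "high", "low", "high", "high"]
rater2 = ["low", "high", "high", "low", "high", "low", "low", "high"]
print(round(cohens_kappa(rater1, rater2), 3))
```

The verbal labels used in the CCRBT abstract ("slight", "fair", "excellent" agreement) correspond to conventional bands of such an agreement coefficient.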
11,004
32,322,540
Discussion While many scales are commonly used to measure verbal memory in first-episode psychosis, they are not often administered via digital technology.
Background Even in the early phases of psychotic spectrum illnesses such as schizophrenia, patients can experience cognitive decline or deficits prior to the onset of psychotic symptoms such as delusions and hallucinations. In this systematic review, we assessed which verbal memory assessments are most widely used in first-episode psychosis and may be applied via digital technologies (smartphone applications, etc.) for use in early detection.
Background Cognitive remediation ( CR ) is an effective treatment for several psychiatric disorders . To date , there have been no published studies examining solely first-episode psychiatric cohorts , despite the merits demonstrated by early intervention CR studies . The current study aim ed to assess the effectiveness of CR in patients with a first-episode of either major depression or psychosis . Method Fifty-five patients ( mean age = 22.8 years , s.d . = 4.3 ) were r and omly assigned to either CR ( n = 28 ) or treatment as usual ( TAU ; n = 27 ) . CR involved once-weekly 2-h sessions for a total of 10 weeks . Patients were comprehensively assessed before and after treatment . Thirty-six patients completed the study , and analyses were conducted using an intent-to-treat ( ITT ) approach with all available data . Results In comparison to TAU , CR was associated with improved immediate learning and memory controlling for diagnosis and baseline differences . Similarly , CR patients demonstrated greater improvements than TAU patients in psychosocial functioning irrespective of diagnosis . Delayed learning and memory improvements mediated the effect of treatment on psychosocial functioning at a marginal level . Conclusions CR improves memory and psychosocial outcome in first-episode psychiatric out- patients for both depression and psychosis . Memory potentially mediated the functional gains observed . Future studies need to build on the current findings in larger sample s using blinded allocation and should incorporate longitudinal follow-up and assessment of potential moderators ( e.g. social cognition , self-efficacy ) to examine sustainability and the precise mechanisms of CR effects respectively Rationale Cognitive impairments are important determinants of functional outcome in psychosis , which are inadequately treated by antipsychotic medication . 
Modafinil is a wake-promoting drug that has been shown to improve attention , memory and executive function in the healthy population and in patients with schizophrenia . Objectives We aim ed to establish modafinil ’s role in the adjunctive treatment of cognitive impairments in the first episode of psychosis , a time when symptoms may be more malleable than at chronic stages of the disease . Methods Forty patients with a first episode of psychosis participated in a r and omised , double-blind , placebo-controlled crossover design study assessing the effects of a single dose of 200 mg modafinil on measures of executive functioning , memory , learning , impulsivity and attention . Results Modafinil improved verbal working memory ( d = 0.24 , p = 0.04 ) , spatial working memory errors ( d = 0.30 , p = 0.0004 ) and strategy use ( d = 0.23 , p = 0.03 ) . It also reduced discrimination errors in a task testing impulsivity . Modafinil showed no effect on impulsivity measures , sustained attention , attentional set-shifting , learning or fluency . Conclusions Modafinil selectively enhances working memory in first episode psychosis patients , which could have downstream effects on patients ’ social and occupational functioning First-episode schizophrenia ( FES ) spectrum disorders are associated with pronounced cognitive dysfunction across all domains . However , less is known about the course of cognitive functioning , following the first presentation of psychosis , and the relationship of cognition to clinical course during initial treatment . The present longitudinal study examined the magnitude of neurocognitive impairment , using the MATRICS Consensus Cognitive Battery , in patients experiencing their first episode of psychosis at baseline and after 12 weeks of r and omized antipsychotic treatment with either aripiprazole or risperidone . At baseline , FES patients evidence d marked impairments in cognitive functioning . 
Notably , performance on the mazes task of planning and reasoning significantly predicted the likelihood of meeting stringent criteria for positive symptom remission during the first 12 weeks of the trial . Performance on indices of general cognitive function , working memory , and verbal learning improved over time , but these improvements were mediated by improvements in both positive and negative symptoms . We did not detect any differential effects of antipsychotic medication assignment ( aripiprazole vs risperidone ) on cognitive functioning . Our results suggest that a brief paper- and -pencil measure reflecting planning/reasoning abilities may index responsivity to antipsychotic medication . However , improvements in cognitive functioning over time were related to clinical symptom improvement , reflecting " pseudospecificity . BACKGROUND Disturbed functional connectivity is assumed to underlie neurocognitive deficits in patients with schizophrenia . As neurocognitive deficits are already present in the high-risk state , identification of the neural networks involved in this core feature of schizophrenia is essential to our underst and ing of the disorder . Resting-state studies enable such investigations , while at the same time avoiding the known confounder of impaired task performance in patients . The aim of the present study was to investigate EEG resting-state connectivity in high-risk individuals ( HR ) compared to first episode patients with schizophrenia ( SZ ) and to healthy controls ( HC ) , and its association with cognitive deficits . METHODS 64-channel resting-state EEG recordings ( eyes closed ) were obtained for 28 HR , 19 stable SZ , and 23 HC , matched for age , education , and parental education . The imaginary coherence-based multivariate interaction measure ( MIM ) was used as a measure of connectivity across 80 cortical regions and six frequency b and s. 
Mean connectivity at each region was compared across groups using the non-parametric r and omization approach . Additionally , the network-based statistic was applied to identify affected networks in patients . RESULTS SZ displayed increased theta-b and resting-state MIM connectivity across midline , sensorimotor , orbitofrontal regions and the left temporoparietal junction . HR displayed intermediate theta-b and connectivity patterns that did not differ from either SZ or HC . Mean theta-b and connectivity within the above network partially mediated verbal memory deficits in SZ and HR . CONCLUSIONS Aberrant theta-b and connectivity may represent a trait characteristic of schizophrenia associated with neurocognitive deficits . As such , it might constitute a promising target for novel treatment applications OBJECTIVE Cognitive deficits that characterize schizophrenia are present in the prodrome , worsen with illness onset , and predict functional outcome . Cognitive dysfunction is thus a critical target for early intervention in young individuals with recent onset schizophrenia . METHOD This 2-site double-blind r and omized controlled trial investigated cognitive training of auditory processing/verbal learning in 86 subjects with recent onset schizophrenia ( mean age of 21 years ) . Subjects were given laptop computers to take home and were asked to perform 40 hours of training or 40 hours of commercial computer games over 8 weeks . We examined cognitive measures recommended by the Measurement and Treatment Research to Improve Cognition in Schizophrenia initiative ( MATRICS ) , symptoms , and functioning . We also assessed baseline reward anticipation to index motivational system functioning and measured changes in auditory processing speed after 20 hours of training to assess target engagement . 
RESULTS Auditory training subjects demonstrated significant improvements in global cognition, verbal memory, and problem solving compared with those of computer games control subjects. Both groups showed a slight but significant decrease in symptoms and no change in functional outcome measures. Training-induced cognitive gains at posttraining showed significant associations with reward anticipation at baseline and with improvement in auditory processing speed at 20 hours. CONCLUSION Neuroscience-informed cognitive training via laptop computer represents a promising treatment approach for cognitive dysfunction in early schizophrenia. An individual's baseline motivational system functioning (reward anticipation), and ability to engage in auditory processing speed improvement, may represent important predictors of treatment outcome. Future studies must investigate whether cognitive training improves functioning and how best to integrate it into critical psychosocial interventions.

AIM To examine whether baseline neurocognition predicts vocational outcomes over 18 months in patients with first-episode psychosis enrolled in a randomized controlled trial of Individual Placement and Support or treatment as usual. METHODS One hundred and thirty-four first-episode psychosis participants completed an extensive neurocognitive battery. Principal axis factor analysis using PROMAX rotation was used to determine the underlying structure of the battery. Setwise (hierarchical) multiple linear and logistic regressions were used to examine predictors of (1) total hours employed over 18 months and (2) employment status, respectively. Neurocognition factors were entered in the models after accounting for age, gender, premorbid IQ, negative symptoms, treatment group allocation and employment status at baseline.
RESULTS Five neurocognitive factors were extracted: (1) processing speed, (2) verbal learning and memory, (3) knowledge and reasoning, (4) attention and working memory and (5) visual organization and memory. Employment status over 18 months was not significantly predicted by any of the predictors in the final model. Total hours employed over 18 months were significantly predicted by gender (P = .027), negative symptoms (P = .032) and verbal learning and memory (P = .040). Every step of the regression model was a significant predictor of total hours worked overall (final model: P = .013). CONCLUSION Verbal learning and memory, negative symptoms and gender were implicated in duration of employment in first-episode psychosis. The other neurocognitive domains did not significantly contribute to the prediction of vocational outcomes over 18 months. Interventions targeting verbal memory may improve vocational outcomes in early psychosis.

Background Verbal learning and memory are impaired not only in patients with a first episode of psychosis (FEP) but also, to a lower extent, in those with an at-risk mental state for psychosis (ARMS). However, little is known about the specific nature of these impairments. Hence, we aimed to study learning and memory processes in ARMS and FEP patients by making use of structural equation modelling. Methods Verbal learning was assessed with the California Verbal Learning Test (CVLT) in 98 FEP patients, 126 ARMS patients and 68 healthy controls (HC) as part of the Basel early detection of psychosis (FePsy) study. The four-factorial CFA model of Donders was used to estimate test performance on latent variables of the CVLT, and growth curve analysis was used to model the learning curve. The latter allows disentangling initial recall, which is strongly determined by attentional processes, from the learning rate.
Results The CFA model revealed that ARMS and FEP patients were impaired in Attention Span, Learning Efficiency and Delayed Memory, and that FEP patients were additionally impaired in Inaccurate Memory. Additionally, ARMS-NT, but not ARMS-T, performed significantly worse than HC on Learning Efficiency. The growth curve model indicated that FEP patients were impaired in both initial recall and learning rate and that ARMS patients were only impaired in the learning rate. Conclusions Since impairments were more pronounced in the learning rate than the initial recall, our results suggest that the lower scores in the CVLT reported in previous studies are more strongly driven by impairments in the rate of learning than by attentional processes.

Cognitive deficits have an important role in the neurodevelopment of schizophrenia and other psychotic disorders. However, there is a continuing debate as to whether cognitive impairments in the psychosis prodrome are stable predictors of eventual psychosis or undergo a decline due to the onset of psychosis. In the present study, to determine how cognition changes as illness emerges, we examined baseline neurocognitive performance in a large sample of help-seeking youth ranging in clinical state from low risk for psychosis through individuals at clinical high risk (CHR) for illness to early first-episode patients (EFEP). At baseline, the MATRICS Cognitive Consensus battery was administered to 322 individuals (205 CHRs, 28 EFEPs, and 89 help-seeking controls, HSC) that were part of the larger Early Detection, Intervention and Prevention of Psychosis Program study. CHR individuals were further divided into those who did (CHR-T; n = 12, 6.8%) and did not (CHR-NT, n = 163) convert to psychosis over follow-up (mean = 99.20 weeks, SD = 21.54).
ANCOVAs revealed that there were significant overall group differences (CHR, EFEP, HSC) in processing speed, verbal learning, and overall neurocognition, relative to healthy controls (CNTL). In addition, the CHR-NTs performed similarly to the HSC group, with mild to moderate cognitive deficits relative to the CTRL group. The CHR-Ts mirrored the EFEP group, with large deficits in processing speed, working memory, attention/vigilance, and verbal learning (>1 SD below CNTLs). Interestingly, only verbal learning impairments predicted transition to psychosis, when adjusting for age, education, symptoms, antipsychotic medication, and neurocognitive performance in the other domains. Our findings suggest that large neurocognitive deficits are present prior to illness onset and represent vulnerability markers for psychosis. The results of this study further reinforce that verbal learning should be specifically targeted for preventive intervention for psychosis.

INTRODUCTION There is substantial evidence for Theory of Mind (ToM) deficits in patients with schizophrenia. Many psychotic symptoms may best be understood in light of an impaired capacity to infer one's own and other persons' mental states and to relate those to executing behavior. The aim of our study was to investigate ToM abilities in first-episode schizophrenia patients and to analyze them in relation to neuropsychological and psychopathological functioning. MATERIALS AND METHODS A modified Moving Shapes paradigm was used to assess ToM abilities in 23 first-episode patients with schizophrenia and 23 matched healthy controls. Participants had to describe animated triangles which moved (1) randomly, (2) goal-directed, or (3) in complex, socially interactive ways (ToM video sequences). Neuropsychological functioning, psychopathology, autistic and alexithymic features as well as empathetic abilities were correlated with ToM performance.
RESULTS Compared to healthy controls, first-episode schizophrenia patients gave more incorrect descriptions and used less ToM-related vocabulary when responding to socially complex ToM video sequences. No group differences were revealed for videos with random movements. ToM abilities correlated significantly with positive symptoms, reasoning, verbal memory performance and verbal IQ, but not with empathetic abilities or autistic and alexithymic features. When controlling for reasoning, verbal memory performance and verbal IQ, the correctness of video descriptions was still significantly worse in schizophrenia patients. DISCUSSION The results of our study in first-episode schizophrenia patients underline recent findings on ToM deficits in the early course of schizophrenia. Only a moderate influence of neurocognitive deficits on ToM performance was observed. Impairment in ToM abilities seems to be predominantly independent of clinical state, alexithymia and empathy.

OBJECTIVE To investigate the relationship between cognition and employment duration in first-episode psychosis (FEP), and establish if a "fit" between cognition and job complexity is associated with longer employment duration. METHOD This study involved secondary data analysis of a subsample of FEP individuals (n = 65) who participated in a randomized controlled trial comparing Individual Placement and Support plus treatment as usual (TAU), versus TAU alone, over 6 months. A cognitive battery was administered at baseline, and employment duration (hours) and job complexity in the longest held job over 6 months were measured. RESULTS Factor analysis with promax rotation of the cognitive battery revealed 4 cognitive domains: (a) attention and processing speed; (b) verbal learning and memory; (c) verbal comprehension and fluency; and (d) visual organization and memory (VO&M).
The final hierarchical regression model found that VO&M and job complexity independently predicted employment duration in the longest held job; however, the "fit" (or interaction) between VO&M and job complexity was not significant. CONCLUSIONS AND IMPLICATIONS FOR PRACTICE These findings suggest that VO&M and job complexity are important predictors of employment duration, but it is not necessary to ensure VO&M ability matches job complexity. However, there are limited comparative studies in this area, and other aspects of the person-organization fit perspective may still be useful to optimize vocational outcomes in FEP.

Twenty-three adolescents with psychotic disorders, aged from 13 to 18 years, participated in a 12-week open-label trial (17 adolescents completed the study) in order to examine the impact of quetiapine on clinical status and cognitive functions (encompassing processing speed, attention, short-term memory, long-term memory and executive function). An improvement in Clinical Global Impression and Positive and Negative Symptom Scale (Ps ≤ 0.001) was observed. In addition, after controlling for amelioration of symptoms, a significant improvement was observed on one executive function (P = 0.044; Trail Making Part B). The remaining cognitive abilities showed stability. In addition, we observed an interaction between quetiapine doses (>300 mg/day or <300 mg/day) and time, where lower doses showed more improvement in verbal short-term memory (P = 0.048), inhibition abilities (P = 0.038) and positive symptoms (P = 0.020). The neuropsychological functioning of adolescents with psychotic disorders remained mainly stable after 12 weeks of treatment with quetiapine.
However, lower doses seemed to have a better impact on two components of cognition (inhibition abilities and verbal short-term memory) and on positive symptoms.

AIMS Memory impairment in psychosis may be mediated through detrimental effects on hypothalamic-pituitary-adrenal (HPA) axis function. This study prospectively investigated the relationship between cortisol, dehydroepiandrosterone sulphate (DHEA(S)) and the cortisol:DHEA(S) ratio and memory in 35 first-episode psychosis (FEP) patients during the first 12 weeks of treatment and 23 healthy controls (HC). METHODS Morning blood sampling and tests of attention, working memory and verbal memory occurred at baseline and 12-week follow-up. RESULTS FEP and HC groups did not significantly differ in levels of cortisol, DHEA(S) or their ratio at baseline or over 12 weeks. The FEP group performed significantly below HC on all cognitive measures at baseline and over 12 weeks. Cortisol levels were unrelated to cognition in both groups. At baseline, DHEA(S) was positively associated with attention in HCs, but negatively associated with attention in FEP participants. Change in DHEA(S) was negatively associated with change in memory over 12 weeks in both groups. At 12 weeks, there was a negative correlation between the cortisol:DHEA(S) ratio and attention in both groups. CONCLUSIONS These findings are mostly in contrast to findings in chronic schizophrenia. Investigation at different illness phases and over longer follow-up periods is required to determine the complex relationship between the HPA axis and memory functioning in psychosis.

OBJECTIVE To examine the spontaneous blink rate over a 3-year period and its clinical and cognitive correlates among patients with first-episode schizophrenia. METHODS This study prospectively followed 93 patients with first-episode schizophrenia, schizophreniform and schizoaffective disorders for 3 years.
Patients were longitudinally assessed for blink rate, their positive and negative symptoms, and a range of cognitive features including verbal fluency, verbal memory, visual memory, and Wisconsin Card Sorting Test performance. RESULTS When compared with a matched control group, there was a significantly higher blink rate at the 3-year follow-up but not at initial presentation. The increase in blink rate over time correlated positively with the number of relapses. It also correlated with logical memory, verbal fluency, categories completed, and perseverative errors in the Wisconsin Card Sorting Test. The increased blink rate also correlated with pre-morbid schizoid and schizotypal traits. All these correlations were statistically significant. CONCLUSION The change in the blink rate over time may reflect underlying involvement of the dopaminergic system in mediating relapse and cognitive functions.

Prospective memory (PM) is the ability to remember to carry out intended actions in the future. Empirical evidence suggests that PM deficits exist in individuals with chronic schizophrenia. However, it is unclear whether PM deficits in first-episode schizophrenia exist independently from other neuropsychological deficits. Moreover, prior research using patients with a first episode has been limited to small inpatient samples. We aimed to clarify the nature and extent of PM deficits in individuals with first-episode schizophrenia, using a large outpatient sample. Participants were 91 clinically stable outpatients with first-episode schizophrenia and 83 healthy controls. PM was assessed using both a subjective self-reported checklist and a laboratory-based task capturing time- and event-based PM. A battery assessing verbal and visuo-spatial working memory, as well as executive functions, was also administered.
ANOVA analyses showed that patients with first-episode schizophrenia performed significantly poorer than healthy controls in time- and event-based PM. Stepwise linear regression analyses suggested that cognitive flexibility predicted time- and event-based PM, and working memory predicted event-based PM. Subgroup analyses showed that "cognitive-preserved" patients with first-episode schizophrenia tended to perform poorer in time-based PM than healthy controls who were matched in IQ and other neuropsychological functions. Overall, our results provide substantial evidence that time-based PM deficits in first-episode schizophrenia are apparent and not entirely attributable to other neuropsychological deficits. PM may constitute a neuropsychological marker for schizophrenia.

AIMS To examine whether baseline neurocognition and social cognition predict vocational outcomes over 6 months in patients with first-episode psychosis (FEP) enrolled in a randomised controlled trial of Individual Placement and Support (IPS) versus treatment as usual (TAU). METHODS 135 FEP participants (IPS n=69; TAU n=66) completed a comprehensive neurocognitive and social cognitive battery. Principal axis factor analysis using PROMAX rotation was used to determine the underlying cognitive structure of the battery. Setwise (hierarchical) logistic and multivariate linear regressions were used to examine predictors of: (a) enrolment in education and employment; and (b) hours of employment over 6 months. Neurocognition and social cognition factors were entered into the models after accounting for premorbid IQ, baseline functioning and treatment group. RESULTS Six cognitive factors were extracted: (i) social cognition; (ii) information processing speed; (iii) verbal learning and memory; (iv) attention and working memory; (v) visual organisation and memory; and (vi) verbal comprehension.
Enrolment in education over 6 months was predicted by enrolment in education at baseline (p=.002) and poorer visual organisation and memory (p=.024). Employment over 6 months was predicted by employment at baseline (p=.041) and receiving IPS (p=.020). Better visual organisation and memory predicted total hours of paid work over 6 months (p<.001). CONCLUSIONS Visual organisation and memory predicted the enrolment in education and duration of employment, after accounting for premorbid IQ, baseline functioning and treatment. Social cognition did not contribute to the prediction of vocational outcomes. Neurocognitive interventions may enhance employment duration in FEP.

Many modalities of cognition are affected in schizophrenia. The most common findings include dysfunctions of episodic and working memory and of executive functions. Although an inverse correlation between cortisol level and memory function has been proven, few studies have focused on the relationship between cortisol level and cognitive impairment in patients with schizophrenia. In an open, naturalistic, prospective study of consecutively hospitalized males diagnosed with first-episode schizophrenia, hypothalamic-pituitary-adrenal axis activity (afternoon cortisol levels, post-dexamethasone cortisol levels) was evaluated before and at the end of acute treatment. Psychopathology was assessed using the Positive and Negative Syndrome Scale (PANSS). Cognitive functions (memory, attention, psychomotor, verbal fluency, and executive functions) were tested after symptom alleviation using a neurocognitive test battery. In the total sample (n = 23), significant decreases in total PANSS score (including all subscales), afternoon cortisol levels, and post-dexamethasone cortisol levels occurred during the course of treatment.
It was found that higher afternoon cortisol levels at the beginning of treatment were significantly related to impaired performance in memory functions. Afternoon cortisol levels were not significantly associated with other measured cognitive functions. No correlation was discovered between cognitive functions and post-dexamethasone cortisol levels. The determination of afternoon cortisol levels may serve to detect potential candidates for specific cognitive intervention immediately after the first psychotic breakthrough.

Aim The aims of this study were to examine if people with first-episode psychosis (FEP) are able to continue adhering to exercise after a supervised intervention and to explore if the benefits of exercise can be sustained. Methods Twenty-eight persons with FEP took part in a 10-week exercise intervention that provided each participant with twice-weekly accompaniment to exercise activities of their own choice, of whom 20 were re-assessed 6 months after the intervention. Long-term adherence to exercise was assessed, and measures of psychiatric symptoms, physical health, neurocognition and social functioning were administered at baseline, post-intervention and 6-month follow-up. Results During the supervised intervention, participants achieved 124.4 min of moderate-to-vigorous exercise per week. After 6 months, physical activity levels had decreased significantly (P = 0.025) and only 55% of participants had continued to exercise weekly. Repeated-measures analysis of variance found that the significant improvements in psychiatric symptoms and social functioning observed immediately after the intervention were maintained at 6 months (P = 0.001). However, post hoc analyses showed that symptomatic reductions were only maintained for those who continued to exercise, whereas symptom scores increased among those who had ceased exercising.
Previously observed improvements in waist circumference and verbal memory were lost by 6 months. Conclusion Long-term exercise participation is associated with significant benefits for symptoms, cognition and social functioning in FEP. However, adherence to unsupervised exercise is low. Future research should explore the effectiveness of 'step-down' support following supervised interventions, and aim to establish sustainable methods for maintaining regular exercise in order to facilitate functional recovery and maintain physical health.

BACKGROUND Several sex differences in schizophrenia have been reported, including differences in cognitive functioning. Studies with schizophrenia patients and healthy controls (HC) indicate that the sex advantage for women in verbal domains is also present in schizophrenia patients. However, findings have been inconsistent. No study has yet focused on sex-related cognitive performance differences in individuals with an at-risk mental state for psychosis (ARMS). Thus, the aim of the present study was to investigate sex differences in cognitive functioning in ARMS, first-episode psychosis (FEP) and HC subjects. We expected a better verbal learning and memory performance of women in all groups. METHODS The neuropsychological data analysed in this study were collected within the prospective Früherkennung von Psychosen (FePsy) study. In total, 118 ARMS, 88 FEP individuals and 86 HC completed a cognitive test battery covering the domains of executive functions, attention, working memory, verbal learning and memory, IQ and speed of processing. RESULTS Women performed better in verbal learning and memory regardless of diagnostic group. By contrast, men as compared to women showed a shorter reaction time during the working memory task across all groups. CONCLUSION The results provide evidence that women generally perform better in verbal learning and memory, independent of diagnostic group (ARMS, FEP, HC).
The finding of a shorter reaction time for men in the working memory task could indicate that men have a superior working memory performance, since they responded faster during the target trials while maintaining a comparable overall working memory performance level.

BACKGROUND Cognitive impairment is a core feature of schizophrenia. Its relationship with duration of untreated psychosis (DUP), a potentially malleable prognostic factor, has been less studied, with inconsistent findings being observed in the literature. Previous research investigating such a relationship was mostly cross-sectional, and none of those prospective studies had a follow-up duration beyond 2 years. METHOD A total of 93 Hong Kong Chinese aged 18 to 55 years presenting with first-episode schizophrenia-spectrum disorder were studied. DUP and pre-morbid adjustment were measured using a structured interview incorporating multiple sources of information. Psychopathological evaluation was administered at intake, after clinical stabilization of the first psychotic episode, and at 12, 24 and 36 months. Cognitive functions were measured at clinical stabilization, and at 12, 24 and 36 months. RESULTS DUP exerted differential effects on various cognitive domains, with memory deficits being the most related to DUP even when potential confounders including pre-morbid adjustment and sex were adjusted. Prolonged DUP was associated with more severe impairment in visual memory at clinical stabilization and verbal memory at 24 and 36 months. Further, patients with a long DUP were found to have worse outcomes on negative symptoms at 36 months. The effects of DUP on verbal memory remained significant even when negative symptoms were taken into consideration. CONCLUSIONS Our findings provided further supportive evidence that delayed treatment of first-episode psychosis is associated with poorer cognitive and clinical outcomes.
In addition, DUP may specifically affect memory function, and its adverse impact on verbal memory may only become evident at a later stage of the recovery process.

OBJECTIVE Aggression, suicidality and involuntary treatment constitute severe clinical problems in first-episode psychosis (FEP). Although there are studies on the prevalence and clinical predictors of these conditions, little is known on the influence of psychopathology and neuropsychological dysfunction. METHOD 152 FEP inpatients were prospectively assessed using the Brief Psychiatric Rating Scale (BPRS) and a neuropsychological examination covering the domains 'processing speed', 'concentration and attention', 'executive function', 'working memory', 'verbal memory', 'verbal comprehension', 'logical reasoning', 'global cognition', and 'general intelligence'. Clinical data were collected retrospectively in a structured file audit trial. RESULTS Patients were aged 24.5±4.9 years, and 112 (74%) were male. At admission, 13 (9%) patients presented with severe aggression, and 28 (18%) with severe suicidality. 31 patients (20%) received involuntary treatment. In multivariate analyses, aggression was predicted by the BPRS-Excited Component (BPRS-EC; p=.001), suicidality was predicted by BPRS-EC (p=.013) and general intelligence (p=.016), and predictors for involuntary treatment were BPRS-EC (p=.001) and neuropsychological dysfunction in the domain 'concentration and attention' (p=.016). CONCLUSION Psychopathology and neuropsychological functioning independently predict dangerous behavior in FEP patients. Some correlations with neuropsychology (e.g., of aggression with concentration/attention) are absent in multivariate analyses and may thus constitute a proxy of psychopathological features. In addition to clinical data, BPRS-EC can be used as a predictor of dangerous behavior.
Patients with severe aggression and suicidality show different patterns of neuropsychological dysfunction, indicating that suicidality should not be conceptualized as a subtype of aggressive behavior.

PURPOSE The study examined the rate of remission in individuals experiencing a first episode of schizophrenia (FES) in China and explored predictors of remission in the acute phase of the illness. DESIGN AND METHODS Fifty-five FES patients were randomly treated with risperidone, olanzapine, or aripiprazole at therapeutic doses for 8 weeks, and their clinical profiles and cognition were assessed using standardized assessment instruments at entry and the end of the study. FINDINGS Of the 55 patients, 30 (54.5%) remitted by the end of the 8-week study. In univariate analyses, shorter duration of untreated psychosis, higher scores on both the time-based prospective memory (TBPM) and event-based prospective memory tasks and the Hopkins Verbal Learning Test-Revised, and less severe negative symptoms were significantly associated with remission. In stepwise multiple logistic regression analyses, only higher scores on the TBPM significantly predicted remission. Individuals having higher scores reflecting better TBPM at baseline were more likely to achieve remission after 8 weeks of optimized antipsychotic treatment. PRACTICE IMPLICATIONS TBPM may be useful in helping clinicians identify those FES patients most likely to achieve a favorable treatment response.

BACKGROUND Although relapse in psychosis is common, a small proportion of patients will not relapse in the long term. We examined the proportion and predictors of patients who never relapsed in the 10 years following complete resolution of positive symptoms from their first psychotic episode. METHOD Patients who previously enrolled in a 12-month randomized controlled trial on medication discontinuation and relapse following first-episode psychosis (FEP) were followed up after 10 years.
Relapse of positive symptoms was operationalized as a change from a Clinical Global Impression scale positive score of <3 for at least 3 consecutive months to a score of ⩾3 (mild or more severe). Baseline predictors included basic demographics, premorbid functioning, symptoms, functioning, and neurocognitive functioning. RESULTS Out of 178 first-episode patients, 37 (21%) never relapsed during the 10-year period. Univariate predictors (p ⩽ 0.1) of patients who never relapsed included a duration of untreated psychosis (DUP) ⩽30 days, being diagnosed with non-schizophrenia spectrum disorders, having less severe negative symptoms, and performing better in logical memory immediate recall and verbal fluency tests. A multivariate logistic regression analysis further suggested that the absence of any relapsing episodes was significantly related to better short-term verbal memory, shorter DUP, and non-schizophrenia spectrum disorders. CONCLUSIONS Treatment delay and neurocognitive function are potentially modifiable predictors of good long-term prognosis in FEP. These predictors are informative as they can be incorporated into an optimum risk prediction model in the future, which would help with clinical decision making regarding maintenance treatment in FEP.

Introduction: The aim of the study was to elucidate the association between performance-related neurocognitive abilities and Theory of Mind (ToM) as measured by Hinting Task (HT) performance, and to investigate the psychometric properties of the HT for use in first-episode psychosis (FEP). Methods: Cross-sectional data of 132 participants with FEP, aged 15-25 years, enrolled in a randomised controlled trial of vocational intervention, were analysed. A comprehensive cognitive battery including social cognitive and neurocognitive measures, a social and occupational functioning measure, and psychopathological measures was used.
Psychometric properties were measured through bivariate correlations, and associations with neurocognitive domains were assessed through hierarchical regression. Results: Low convergent validity of the HT with other ToM measures, moderate discriminant validity with an emotion recognition task, low predictive validity with social and occupational functioning, and high internal consistency were revealed. HT performance was significantly associated with verbal reasoning and verbal memory. Conclusion: Results provide preliminary evidence of low convergent validity and moderate discriminant validity of the HT in FEP, and of the influence of verbal reasoning and verbal memory on HT performance, indicating that caution is warranted when employing the HT in isolation as a screening tool for detection of ToM deficits in FEP.

Grey matter (GM) volume alterations have been repeatedly demonstrated in patients with first-episode psychosis (FEP). Some of these neuroanatomical abnormalities are already evident in the at-risk mental state (ARMS) for psychosis. Not only GM alterations but also neurocognitive impairments predate the onset of frank psychosis, with verbal learning and memory (VLM) being among the most impaired domains. Yet, their interconnection with alterations in GM volumes remains ambiguous. Thus, we evaluated associations of different subcortical GM volumes in the medial temporal lobe with VLM performance in antipsychotic-naïve ARMS and FEP patients. Data from 59 ARMS and 31 FEP patients, collected within the prospective Früherkennung von Psychosen study, were analysed. Structural T1-weighted images were acquired using a 3 Tesla magnetic resonance imaging scanner. VLM was assessed using the California Verbal Learning Test and its factors Attention Span, Learning Efficiency, Delayed Memory and Inaccurate Memory. FEP patients showed significantly enlarged volumes of the hippocampus, pallidum, putamen and thalamus compared to ARMS patients.
A significant negative association between amygdala and pallidum volume and Attention Span was found in ARMS and FEP patients combined , which however did not withst and correction for multiple testing . Although we found significant between‐group differences in subcortical volumes and VLM is among the most impaired cognitive domains in emerging psychosis , we could not demonstrate an association between low performance and subcortical GM volumes alterations in antipsychotic‐naïve patients . Hence , deficits in this domain do not appear to stem from alterations in subcortical structures OBJECTIVE Schizophrenia is associated with a marked cognitive impairment that is widely believed to remain stable after illness onset . Yet , to date , 10-year prospect i ve studies of cognitive functioning following the first episode with good methodology are rare . The authors examined whether schizophrenia patients experience cognitive decline after the first episode , whether this decline is generalized or confined to individual neuropsychological functions , and whether decline is specific to schizophrenia . METHODS Participants were from a population -based case-control study of patients with first-episode psychosis who were followed prospect ively up to 10 years after first admission . A neuropsychological battery was administered at index presentation and at follow-up to patients with a diagnosis of schizophrenia ( N=65 ) or other psychoses ( N=41 ) as well as to healthy comparison subjects ( N=103 ) . RESULTS The schizophrenia group exhibited declines in IQ and in measures of verbal knowledge and of memory , but not processing speed or executive functions . Processing speed and executive function impairments were already present at the first episode and remained stable thereafter . The magnitude of declines ranged between 0.28 and 0.66 st and ard deviations . 
Decline in measures of memory was not specific to schizophrenia and was also apparent in the group of patients with other psychoses . Healthy individuals with low IQ showed no evidence of decline , suggesting that a decline is specific to psychosis . CONCLUSIONS Patients with schizophrenia and other psychoses experience cognitive decline after illness onset , but the magnitude of decline varies across cognitive functions . Distinct mechanisms consequent to the illness and /or psychosocial factors may underlie impairments across different cognitive functions Importance It remains uncertain whether people with psychotic disorders experience progressive cognitive decline or normal cognitive aging after first hospitalization . This information is essential for prognostication in clinical setting s , deployment of cognitive remediation , and public health policy . Objective To examine long-term cognitive changes in individuals with psychotic disorders and to compare age-related differences in cognitive performance between people with psychotic disorders and matched control individuals ( ie , individuals who had never had psychotic disorders ) . Design , Setting , and Participants The Suffolk County Mental Health Project is an inception cohort study of first-admission patients with psychosis . Cognitive functioning was assessed 2 and 20 years later . Patients were recruited from the 12 inpatient facilities of Suffolk County , New York . At year 20 , the control group was recruited by r and om digit dialing and matched to the clinical cohort on zip code and demographics . Data were collected between September 1991 and July 2015 . Analysis began January 2016 . 
Main Outcomes and Measures Change in cognitive functioning in 6 domains : verbal knowledge ( Wechsler Adult Intelligence Scale-Revised vocabulary test ) , verbal declarative memory ( Verbal Paired Associates test I and II ) , visual declarative memory ( Visual Reproduction test I and II ) , attention and processing speed ( Symbol Digit Modalities Test-written and oral ; Trail Making Test [TMT]-A ) , abstract ion-executive function ( Trenerry Stroop Color Word Test ; TMT-B ) , and verbal fluency ( Controlled Oral Word Association Test ) . Results A total of 705 participants were included in the analyses ( mean [ SD ] age at year 20 , 49.4 [ 10.1 ] years ) : 445 individuals ( 63.1 % ) had psychotic disorders ( 211 with schizophrenia spectrum [ 138 ( 65 % ) male ] ; 164 with affective psychoses [ 76 ( 46 % ) male ] ; 70 with other psychoses [ 43 ( 61 % ) male ] ) ; and 260 individuals ( 36.9 % ) in the control group ( 50.5 [ 9.0 ] years ; 134 [ 51.5 % ] male ) . Cognition in individuals with a psychotic disorder declined on all but 2 tests ( average decline : d = 0.31 ; range , 0.17 - 0.54 ; all P < .001 ) . Cognitive declines were associated with worsening vocational functioning ( Visual Reproduction test II : r = 0.20 ; Symbol Digit Modalities Test-written : r = 0.25 ; Stroop : r = 0.24 ; P < .009 ) and worsening negative symptoms ( avolition : Symbol Digit Modalities Test-written : r = -0.24 ; TMT-A : r = -0.21 ; Stroop : r = -0.21 ; all P < .009 ; inexpressivity : Stroop : r = -0.22 ; P < .009 ) . Compared with control individuals , people with psychotic disrders showed age-dependent deficits in verbal knowledge , fluency , and abstract ion-executive function ( vocabulary : β = -0.32 ; Controlled Oral Word Association Test : β = -0.32 ; TMT-B : β = 0.23 ; all P < .05 ) , with the largest gap among participants 50 years or older . 
Conclusions and Relevance In individuals with psychotic disorders , most cognitive functions declined over 2 decades after first hospitalization . Observed declines were clinical ly significant . Some declines were larger than expected due to normal aging , suggesting that cognitive aging in some domains may be accelerated in this population . If confirmed , these findings would highlight cognition as an important target for research and treatment during later phases of psychotic illness
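The cognitive-decline studies above summarize change as a standardized mean difference (Cohen's d, e.g. the average decline of d = 0.31). As a minimal sketch of how such an effect size is computed from two groups' summary statistics (the means, SDs and group sizes below are hypothetical, not taken from any study quoted here):

```python
import math

def cohens_d(mean1: float, sd1: float, n1: int,
             mean2: float, sd2: float, n2: int) -> float:
    """Standardized mean difference using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                          / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Hypothetical example: group mean IQ 100 vs. 95, SD 15 in both groups
d = cohens_d(100, 15, 65, 95, 15, 103)
print(round(d, 2))  # 0.33
```

On conventional benchmarks a d of about 0.3 is a small-to-medium effect, which is why declines in the 0.17-0.66 range reported above are described as clinically meaningful.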
No serious adverse events were reported with these combinations
Intubation without prior administration of muscle relaxants is a common practice in children. However, succinylcholine may be considered the gold standard for optimizing intubating conditions. We conducted a systematic review of the literature to identify drug combinations that included induction of anaesthesia with sevoflurane or propofol. Our aim was to select drug combinations that yield excellent intubating conditions in ≥80% of patients; we identified six such combinations in children aged 1-9 years.
Summary We studied the intubating conditions, haemodynamic and endocrine changes following tracheal intubation during sevoflurane anaesthesia guided by Bispectral Index (BIS) monitoring in 40 children who received either remifentanil 1 µg.kg−1 (group R) or saline 1 ml.kg−1 (group S). Acceptable intubating conditions were found in all patients in group R (n = 20), compared to only 12 patients in group S (p = 0.002). There were no intergroup differences in heart rate, systolic blood pressure and plasma concentrations of epinephrine and norepinephrine at any time point, and changes in haemodynamic variables throughout the study period were moderate. Titration of sevoflurane delivery to a target BIS of 35 ± 5 led to almost equal end-tidal sevoflurane concentrations in either group, and remifentanil did not affect the BIS. There were no side-effects in either group that required intervention. Intubating conditions during sevoflurane anaesthesia in children were found to be improved by a single bolus dose of remifentanil 1 µg.kg−1
BACKGROUND We studied 120 children aged 2-7 yr in a prospective, randomized, assessor-blinded fashion to define the optimal rocuronium dose which provides a 95% probability of acceptable intubation conditions (ED95TI) during inhalation induction with sevoflurane. METHODS After inhalation induction with 8% sevoflurane in 60% nitrous oxide and 40% oxygen, and loss of the eyelash reflex, we administered rocuronium (0.1, 0.15, 0.22, 0.3, or 0.6 mg kg-1) or placebo. We quantified neuromuscular function by stimulation of the ulnar nerve at 0.1 Hz to produce contraction of the adductor pollicis muscle using accelerometry. Intubation conditions were assessed 2 min after test drug injection. The optimal rocuronium dose was defined as the lowest dose which allowed acceptable intubation conditions in 95% of children (ED95TI). RESULTS Two minutes after injection of placebo or rocuronium, intubation conditions were acceptable in 35, 45, 80, 90, 95, and 100% of children, respectively. Rocuronium 0.07 [CI 0.02-0.11], 0.24 [0.19-0.31], and 0.29 [0.23-0.38] mg kg-1 provided 50, 90, and 95% probability of acceptable intubating conditions. When thumb acceleration was depressed by 50% or more, intubating conditions were considered acceptable in 97% of children. Recovery of the train-of-four ratio to 0.8 averaged 12 (7), 16 (7), 24 (7), 24 (8), and 50 (22) min after the respective dose of rocuronium. CONCLUSIONS During inhalation induction with 8% sevoflurane in 60% nitrous oxide, rocuronium 0.29 mg kg-1 (ED95) optimizes intubation conditions for surgery of short duration
BACKGROUND: Intubation without the use of muscle relaxants in children is frequently done before IV access is secured. In this randomized controlled trial, we compared intubating conditions and airway response to intubation (coughing and/or movement) after sevoflurane induction in children at 2 and 3 min after the administration of intranasal remifentanil (4 mcg/kg) or saline. METHODS: One hundred eighty-eight children, 1-7 yr old, were studied. Nasal remifentanil (4 mcg/kg) or saline was administered 1 min after an 8% sevoflurane N2O induction. The sevoflurane concentration was then reduced to 5% in oxygen, and ventilation assisted/controlled. An anesthesiologist blinded to treatment assignment used a validated score to evaluate the conditions for laryngoscopy and response to intubation. Blood samples for determination of remifentanil blood concentrations were collected from 17 children at baseline, 2, 3, 4, and 10 min after nasal administration of remifentanil. RESULTS: Good or excellent intubating conditions were achieved at 2 min (after the remifentanil bolus) in 68.2% and at 3 min in 91.7% of the children who received intranasal remifentanil, versus 37% and 23% in children who received placebo (P < 0.01). The mean remifentanil plasma concentrations (±SD) at 2, 3, 4, and 10 min were 1.0 (0.60), 1.47 (0.52), 1.70 (0.46), and 1.16 (0.36) ng/mL, respectively. Peak plasma concentration was observed at 3.47 min. There were no complications associated with the use of nasal remifentanil. CONCLUSIONS: Nasal administration of remifentanil produces good-to-excellent intubating conditions in 2-3 min after sevoflurane induction of anesthesia
Tracheal intubating conditions were assessed in 112 children after induction of anaesthesia with propofol and remifentanil 1.0, 2.0 or 3.0 µg.kg−1. Subjects in a control group were given propofol and mivacurium 0.2 mg.kg−1. Haemodynamic and respiratory parameters were recorded. Plasma catecholamine levels were measured in a subgroup of 40 children. Intubating conditions were acceptable in 14/28 (50%), 18/26 (69%) and 22/27 (82%) of those subjects given remifentanil 1.0, 2.0 or 3.0 µg.kg−1, respectively, and in 27/28 (96%) of the control group. Intubating conditions in subjects given remifentanil 3.0 µg.kg−1 were better than in those given remifentanil 1.0 µg.kg−1 (p < 0.05). There were no significant differences in intubating conditions between those given remifentanil 3.0 µg.kg−1 and the control group. Systolic blood pressure and heart rate increased in response to tracheal intubation in subjects given remifentanil 1.0 µg.kg−1 and in the control group (p < 0.05). Time to resumption of spontaneous respiration was prolonged in subjects given remifentanil 3.0 µg.kg−1 (p < 0.001).
In conclusion, remifentanil 2 µg.kg−1 provides acceptable intubating conditions and haemodynamic stability without prolonging the return of spontaneous respiration
Background: Sevoflurane inhalation induction of anaesthesia is widely used in paediatric anaesthesia. We have found that this method is frequently associated with epileptiform electroencephalogram (EEG) in adults, especially if controlled hyperventilation is used
IMPLICATIONS Supplementing a sevoflurane induction of anesthesia in children with IV lidocaine 2 mg/kg can suppress cough after tracheal intubation and thus improve intubating conditions. In addition, lidocaine minimizes blood pressure fluctuations after tracheal intubation
BACKGROUND Remifentanil is known to cause bradycardia and hypotension. We aimed to characterize the haemodynamic profile of remifentanil during sevoflurane anaesthesia in children with or without atropine. METHODS Forty children who required elective surgery received inhalational induction of anaesthesia using 8% sevoflurane. They were allocated randomly to receive either atropine 20 microg kg(-1) (atropine group) or Ringer's lactate (control group) after 10 min of steady-state 1 MAC sevoflurane anaesthesia (baseline). Three minutes later (T0), all children received remifentanil 1 microg kg(-1) injected over a 60 s period, followed by an infusion of 0.25 microg kg(-1) min(-1) for 10 min, then 0.5 microg kg(-1) min(-1) for 10 min. Haemodynamic variables and echocardiographic data were determined at baseline, T0, T5, T10, T15 and T20 min. RESULTS Remifentanil caused a significant decrease in heart rate compared with the T0 value, which was greater at T20 than T10 in the two groups; however, the values at T10 and T20 were not significantly different from baseline in the atropine group. In comparison with T0, there was a significant fall in blood pressure in the two groups. Remifentanil caused a significant decrease in the cardiac index with or without atropine. Remifentanil did not cause variation in stroke volume (SV). In both groups, a significant increase in systemic vascular resistance occurred after administration of remifentanil. Contractility decreased significantly in the two groups, but this decrease remained moderate (between -2 and +2 SD). CONCLUSION Remifentanil produced a fall in blood pressure and cardiac index, mainly as a result of a fall in heart rate. Although atropine was able to reduce the fall in heart rate, it did not completely prevent the reduction in cardiac index
BACKGROUND The aim of our study was to determine the optimal dose of propofol preceded by fentanyl for successful tracheal intubation and to assess its effectiveness in blunting the pressor response in children aged 3-10 years. METHODS This prospective, double-blind, randomized study was conducted on 60 ASA grade I and II children, between 3 and 10 years, undergoing elective surgery, divided into three groups of 20 each. The children received different doses of propofol (group I, 2.5 mg x kg(-1); group II, 3.0 mg x kg(-1); group III, 3.5 mg x kg(-1)) preceded by a fixed dose of fentanyl (3.0 microg x kg(-1)) 3 min earlier. The tracheal intubating conditions were graded based on the scoring system devised by Helbo-Hensen et al. with Steyn modification, which includes five criteria: ease of laryngoscopy, degree of coughing, position of vocal cords, jaw relaxation, and limb movement, each graded on a 4-point scale. Heart rate (HR), mean arterial pressure (MAP), and oxygen saturation changes were also noted. RESULTS Tracheal intubating conditions were acceptable in 25% of the patients in group I, and significantly higher (P < 0.001) in group II (80%) and group III (90%). The pressor response was not effectively blunted in group I (17% increase in HR), while it was effectively blunted in groups II and III. A fall in cardiac output was seen in group III, indicated by a decrease in MAP (16%) and HR (11%). No airway complications were noted. CONCLUSIONS Propofol 3 mg x kg(-1) (group II) preceded by fentanyl 3 microg x kg(-1) is the optimal dose combination in our study. It provides acceptable intubating conditions in 80% of patients and blunts the pressor response to intubation without significant cardiovascular depression
BACKGROUND Drug effect lags behind the blood concentration. The goal of this investigation was to determine the time course of plasma concentration and the effects of propofol demonstrated by electroencephalogram or blood pressure changes, and to compare them between elderly and young or middle-aged patients. METHODS A target-controlled infusion was used to rapidly attain and maintain four sequentially increasing, randomly selected plasma propofol concentrations from 1 to 12 microg/ml in 41 patients aged 20-85 yr. The target concentration was maintained for about 30 min. Bispectral index (BIS), spectral edge frequency, and systolic blood pressure (SBP) were used as measures of propofol effect. Because the time courses of these measures following the start of drug infusion showed an exponential pattern, the first-order rate constant for equilibration of the effect site with the plasma concentration (k(e0)) was estimated by fitting a monoexponential model to the effect-versus-time data resulting from the pseudo-steady-state propofol plasma concentration profile. RESULTS The half-times for plasma-effect-site equilibration for BIS were 2.31, 2.30, 2.29, and 2.37 min in patients aged 20-39, 40-59, 60-69, and 70-85 yr, respectively (n = 10 or 11 each). The half-times for SBP were 5.68, 5.92, 8.87, and 10.22 min in the respective age groups.
All were significantly longer than for BIS (P < 0.05). The propofol concentration at half of the maximal decrease of SBP was significantly greater (P < 0.05) in the elderly than in the younger patients. CONCLUSIONS The effect of propofol on BIS occurs more rapidly than its effect on SBP. Age has no effect on the rate of BIS reduction with increasing propofol concentration, whereas with increasing age, SBP decreases to a greater degree but more slowly
Objective To evaluate and compare the efficacy, infusion rate and recovery profile of vecuronium and cisatracurium continuous infusion in critically ill children requiring mechanical ventilation. Design and setting Prospective, randomised, double-blind, single-centre study in critically ill children in a paediatric intensive care unit in a tertiary children's hospital. Methods Thirty-seven children from 3 months to 16 years old (median 4.1 years) were randomised to receive either drug; those already receiving more than 6 h of neuromuscular blocking drugs were excluded. The Train-of-Four (TOF) Watch maintained neuromuscular blockade to at least one twitch in the TOF response. Recovery time was measured from cessation of infusion until spontaneous recovery of the TOF ratio to 70%. Results The cisatracurium infusion rate in nineteen children averaged 3.9±1.3 µg kg−1 min−1 with a median duration of 63 h (IQR 23-88). The vecuronium infusion rate in 18 children averaged 2.6±1.3 µg kg−1 min−1 with a median duration of 40 h (IQR 27-72). Median time to recovery was significantly shorter with cisatracurium (52 min, 35-73) than with vecuronium (123 min, 80-480). Prolonged recovery of neuromuscular function (> 24 h) occurred in one child (6%) on vecuronium. Conclusions Recovery of neuromuscular function after discontinuation of neuromuscular blocking drug infusion in children is significantly faster with cisatracurium than vecuronium. Neuromuscular monitoring was not sufficient to eliminate prolonged recovery in children on vecuronium infusions
BACKGROUND The induction characteristics of propofol 1% and 2% were compared in children undergoing ENT surgery in a prospective, randomized, double-blind study. METHODS One hundred and eight children received propofol 1% (n=55) or 2% (n=53) for induction and maintenance of anaesthesia. For induction, propofol 4 mg kg(-1) was injected at a constant rate (1200 ml h(-1)), supplemented with alfentanil. Intubating conditions without the use of a neuromuscular blocking agent were scored. RESULTS Pain on injection occurred in 9% and 21% of patients after propofol 1% and 2%, respectively (P=0.09). Loss of consciousness was more rapid with propofol 2% compared with propofol 1% (47 s vs 54 s; P=0.02). Spontaneous movements during induction occurred in 22% and 34% (P=0.18), and intubating conditions were satisfactory in 87% and 96% (P=0.19) of children receiving propofol 1% or 2%, respectively. There were no differences between the two groups in respect of haemodynamic changes or adverse events. CONCLUSIONS For the end-points tested, propofol 1% and propofol 2% are similar for induction of anaesthesia in children undergoing minor ENT surgery
Purpose: To compare the intubating conditions after remifentanil-propofol with those after a propofol-rocuronium combination, with the aim of determining the optimal dose of remifentanil. Methods: In a randomized, double-blind study, 80 healthy children aged three to nine years were assigned to one of four groups (n=20): 2 or 4 µg·kg−1 remifentanil (Re2 or Re4); 2 µg·kg−1 remifentanil and 0.2 mg·kg−1 rocuronium (Re2-Ro0.2); 0.4 mg·kg−1 rocuronium (Ro0.4). After atropine, remifentanil was injected over 30 sec, followed by 3.5 mg·kg−1 propofol and rocuronium. After 60 sec, laryngoscopy and intubation were attempted.
Intubating conditions were assessed as excellent, good or poor based on ease of ventilation, jaw relaxation, position of the vocal cords, and coughing on intubation. Results: In all children intubation was successful. Overall intubating conditions were better (P<0.01), and the frequency of excellent conditions, 85%, was higher (P<0.01), in the Re4 group than in the Ro0.4 group. No child manifested signs of muscular rigidity. In the remifentanil groups, arterial pressure decreased 11-13% and heart rate 6-9% after anesthetic induction, and both remained at that level throughout the study. Conclusion: The best intubating conditions were produced by the combination of 4 µg·kg−1 remifentanil and 3.5 mg·kg−1 propofol. It provided excellent or good intubating conditions in all children without causing undue cardiovascular depression.
BACKGROUND The aim of this study was to compare intubating conditions and adverse events after sevoflurane induction in infants, with or without the use of rocuronium or alfentanil. METHODS Seventy-five infants, aged 1-24 months, undergoing elective surgery under general anaesthesia were randomly assigned to receive 8% sevoflurane with either placebo (i.v. saline 0.5 ml kg⁻¹), rocuronium (0.3 mg kg⁻¹), or alfentanil (20 µg kg⁻¹). The primary outcome measure was intubating conditions evaluated 90 s after test drug injection by an anaesthetist unaware of the patient's group. The secondary outcome criteria were respiratory adverse events (SpO₂ < 90%, laryngospasm, closed vocal cords preventing intubation, bronchospasm) and haemodynamic adverse events (heart rate and mean arterial pressure variations ≥30% of control value). RESULTS Intubating conditions were significantly better in the rocuronium group, with clinically acceptable intubating conditions in 92%, vs 70% in the alfentanil group and 63% in the placebo group (P=0.044). Adverse respiratory events were significantly less frequent in the rocuronium group: 0% vs 33% in the placebo group and 30% in the alfentanil group (P=0.006).
Haemodynamic adverse events were more frequent in the alfentanil group: 48% vs 7% in the placebo group and 16% in the rocuronium group (P=0.0019). CONCLUSIONS In 1- to 24-month-old infants, the addition of 0.3 mg kg⁻¹ rocuronium to 8% sevoflurane improved intubating conditions and decreased the frequency of respiratory adverse events. Alfentanil provided no additional benefit in this study
The use of suxamethonium in children is associated with undesirable side effects. The synergistic effect of a rocuronium-mivacurium combination can be considered an acceptable alternative to suxamethonium in clinical practice. The calculated ED50 of the rocuronium-mivacurium mixture was only 62% of the predicted value assuming a purely additive interaction. The use of this combination has not been evaluated in children. In this two-part study, we assessed the intubating conditions and pharmacodynamics of suxamethonium, rocuronium, mivacurium or rocuronium-mivacurium combinations in children. We studied 120 ASA I children of both sexes, aged 3-10 yr. Children were premedicated with trimeprazine 2 mg kg-1 orally, and received fentanyl 2 micrograms kg-1 and propofol 2 mg kg-1 for induction of anaesthesia. They were allocated randomly to receive one of the following drugs or drug combinations: suxamethonium 1.0 mg kg-1, mivacurium 0.2 mg kg-1, rocuronium 0.6 or 0.9 mg kg-1, mivacurium 0.1 mg kg-1 with rocuronium 0.3 mg kg-1, or mivacurium 0.15 mg kg-1 with rocuronium 0.45 mg kg-1. In part 1, 60 s after administration of the neuromuscular blocking drug or drug combination, tracheal intubation was performed in 60 children mimicking a rapid sequence induction, and intubating conditions were evaluated by a blinded investigator according to a standard score. In part 2, neuromuscular monitoring was established before administration of the neuromuscular blocking agent(s), and the time from injection of the drug or drug combination until complete ablation of T1 (onset) and recovery of T1 to 25% (duration) were recorded in another 60 children. The frequency distribution of excellent or good intubating conditions in the higher-dose rocuronium and the combination groups was similar to that in the suxamethonium group, but significantly different (P < 0.05) from that in the mivacurium group. Mean onset time was faster in the suxamethonium (55.1 (SD 11.4) s), rocuronium 0.9 mg kg-1 (70.5 (37.7) s), mivacurium 0.1 mg kg-1 with rocuronium 0.3 mg kg-1 (67 (35.9) s) and mivacurium 0.15 mg kg-1 with rocuronium 0.45 mg kg-1 (55 (26.7) s) groups compared with the mivacurium 0.2 mg kg-1 (116 (26.8) s) and rocuronium 0.6 mg kg-1 (97.9 (29) s) groups. This study demonstrated that the combination of rocuronium 0.45 mg kg-1 and mivacurium 0.15 mg kg-1 could be considered an acceptable alternative to suxamethonium when rapid sequence induction of anaesthesia is indicated in children, because it provides uniformly excellent intubating conditions and complete neuromuscular block in < 60 s
We have studied intubating conditions in 64 healthy children, aged 3-10 yr, undergoing adenotonsillectomy, in a double-blind, randomized study. Intubation was performed 150 s after induction using either 8% sevoflurane in nitrous oxide and oxygen or propofol 3-4 mg kg-1 with succinylcholine 2 mg kg-1. An anaesthetist blinded to the technique performed intubation and scored intubating conditions using Krieg and Copenhagen Consensus Conference (CCC) scores. The trachea was intubated successfully at the first attempt in all patients under clinically acceptable conditions, although scores were significantly better with propofol and succinylcholine.
The sevoflurane technique cost £3.62 ± 0.55 to completion of tracheal intubation, significantly more (P < 0.001) than the cost of propofol-succinylcholine and isoflurane (£2.04 ± 0.54) when based on the actual amount of drug used. This cost increased to £4.38 ± 0.05 when based on whole ampoules, which is significantly more than the cost of sevoflurane (P < 0.001)
Background and objective Situations may occur in anaesthetic practice where the use of neuromuscular blocking drugs is unsuitable or contraindicated. We investigated the use of propofol given 5 min after fentanyl to permit endotracheal intubation in children. Methods We studied the intubating conditions and cardiovascular parameters in 60 ASA I and II children. Intravenous midazolam (0.1 mg kg−1) was given as premedication 5 min before the induction of anaesthesia. The children received different doses of propofol (group I, 2.5 mg kg−1; group II, 3.0 mg kg−1; group III, 3.5 mg kg−1) preceded by fentanyl (3.0 µg kg−1) given 5 min earlier. No neuromuscular blocking agents were administered. The intubating conditions were assessed using a four-point scoring system based on the degree of difficulty of laryngoscopy, the position of the vocal cords and the intensity of coughing. Results Tracheal intubating conditions were adequate in 20% of the patients in group I, in 75% of the patients in group II and in 80% of the patients in group III (P < 0.05 for group I vs. groups II and III). Haemodynamic changes were not significantly different between the groups. Conclusions Propofol (3.0 mg kg−1) preceded by fentanyl (3.0 µg kg−1) was adequate for the induction of anaesthesia in children and provided adequate tracheal intubating conditions without significant haemodynamic changes. This method represents a useful alternative technique for tracheal intubation when neuromuscular blocking drugs are contraindicated or should be avoided
We have studied 80 healthy children, aged 2-14 yr, undergoing adenotonsillectomy in a double-blind, randomized design. Tracheal intubation facilitated by either suxamethonium 1.5 mg kg-1 or alfentanil 15 micrograms kg-1 was compared after induction of anaesthesia with propofol 3-4 mg kg-1. The quality of tracheal intubation was graded according to the ease of laryngoscopy, position of the vocal cords, coughing, jaw relaxation and movement of limbs. There were no significant differences in the overall assessment of intubating conditions between the two groups, and all children underwent successful tracheal intubation. Fewer patients coughed (P < 0.014) and limb movement was less common (P < 0.007) after tracheal intubation facilitated by suxamethonium. Alfentanil attenuated the haemodynamic responses to tracheal intubation
Background: The effects of anesthetics on airway protective reflexes have not been extensively characterized in children. The aim of this study was to compare the laryngeal reflex responses in children anesthetized with either sevoflurane or propofol under two levels of hypnosis using the Bispectral Index score (BIS). The authors hypothesized that the incidence of apnea with laryngospasm evoked by laryngeal stimulation would not differ between sevoflurane and propofol when used in equipotent doses, and that laryngeal responsiveness would be diminished with increased levels of hypnosis. Methods: Seventy children, aged 2-6 yr, scheduled to undergo elective surgery were randomly allocated to undergo propofol or sevoflurane anesthesia while breathing spontaneously through a laryngeal mask airway. Anesthesia was titrated to achieve the assigned level of hypnosis (BIS 40 ± 5 or BIS 60 ± 5) in random order.
Laryngeal and respiratory responses were elicited by spraying distilled water on the laryngeal mucosa, and a blinded reviewer assessed evoked responses. Results: Apnea with laryngospasm occurred more often during anesthesia with sevoflurane compared with propofol, independent of the level of hypnosis: episodes lasting longer than 5 s, 34% versus 19% at BIS 40 and 34% versus 16% at BIS 60; episodes lasting longer than 10 s, 26% versus 10% at BIS 40 and 26% versus 6% at BIS 60 (group differences P < 0.04 and P < 0.01, respectively). In contrast, cough and expiration reflex occurred significantly more frequently in children anesthetized with propofol. Conclusion: Laryngeal and respiratory reflex responses in children aged 2–6 yr were different between sevoflurane and propofol, independent of the levels of hypnosis examined in this study. This randomized blinded study tested the hypothesis that equipotent doses of vecuronium and mivacurium given in combination could achieve onset times to 90% neuromuscular block (B90) and intubation scores similar to succinylcholine. Thirty children were randomly assigned to one of three groups as follows. Group Sux received a single dose (1 mg.kg-1) of succinylcholine followed by normal saline. Group V1M1 received 0.08 mg.kg-1 of vecuronium followed by 0.1 mg.kg-1 of mivacurium. Group V2M2 received 0.16 mg.kg-1 of vecuronium followed by 0.2 mg.kg-1 of mivacurium. Anaesthesia consisted of propofol, fentanyl, and nitrous oxide. Neuromuscular response was monitored by adductor pollicis electromyography (Datex NMT). Sixty s after administration of the first injection, laryngoscopy began, with the anaesthesiologist scoring the ease of intubation on a four-category scale as excellent, good, poor, or inadequate.
Time from injection to B90 was 39 (2.6) s after succinylcholine, which was not significantly different from 48 (3.5) s after vecuronium 0.16 mg.kg-1 and mivacurium 0.2 mg.kg-1 (V2M2). Mean time to B90 for group V1M1 was 64 (4.7) s, which was significantly different from that in group Sux. The intubation score was 'excellent' for all patients in groups Sux and V2M2 and for only seven of ten patients in group V1M1. Only the combination of vecuronium (0.16 mg.kg-1) and mivacurium (0.2 mg.kg-1) provided rapid onset of neuromuscular blockade and excellent intubating conditions comparable to succinylcholine 1 mg.kg-1. This combination did result in prolonged recovery times. Background and objective: Remifentanil and propofol have been proposed for intubation without muscle relaxant to avoid the adverse effects of muscle relaxants in children. We hypothesized that the addition of ketamine to remifentanil and propofol would improve intubating conditions and provide haemodynamic stability. Methods: We studied 88 children (3–12 years) undergoing elective surgery. Group K received ketamine 0.5 mg kg−1, remifentanil 3 μg kg−1 and propofol 3 mg kg−1. Group C received isotonic saline instead of ketamine; all other study drugs were the same as in group K. Sixty seconds after administration of propofol, laryngoscopy and tracheal intubation were performed. Intubating conditions were graded. Mean arterial pressure (MAP), heart rate (HR) and SpO2 were recorded. Results: The intubating conditions were regarded as clinically acceptable in 39 out of 44 (89%) children in group K and in 36 out of 44 (82%) children in group C. Although there was no failed intubation in group K, intubation failed in six children in group C (P < 0.05). Tracheal intubation failed in 4/6 children because of severe coughing and/or limb movement, and in 2/6 children because of closed vocal cords. Scores for limb movement were significantly lower in group K than in group C.
When compared with baseline, HR and MAP significantly decreased in both groups during the study (P < 0.05). Conclusion: The addition of ketamine to remifentanil and propofol prevented failed intubation and slightly increased the percentage of acceptable intubating conditions. Ketamine had no influence on haemodynamic changes following remifentanil and propofol administration in the given doses. BACKGROUND: Tracheal intubation in children can be achieved by deep inhalational anaesthesia or an intravenous anaesthetic and a muscle relaxant, suxamethonium being widely used despite several side-effects. Studies have shown that oral intubation can be facilitated safely and effectively in children after induction of anaesthesia with propofol and alfentanil without a muscle relaxant. Remifentanil is a new, ultra-short-acting, selective mu-receptor agonist that is 20–30 times more potent than alfentanil. This clinical study was designed to assess whether a combination of propofol and remifentanil could be used without a muscle relaxant to facilitate tracheal intubation in children. METHODS: Forty children (5–10 years) admitted for adenotonsillectomy were randomly allocated to one of two groups to receive remifentanil 2 microg.kg(-1) (Gp I) or remifentanil 3 microg.kg(-1) (Gp II) before the induction of anaesthesia with i.v. propofol 3 mg.kg(-1). No neuromuscular blocking agent was administered. Intubating conditions were assessed using a four-point scoring system based on ease of laryngoscopy, jaw relaxation, position of vocal cords, degree of coughing and limb movement. Mean arterial pressure (MAP) and heart rate (HR) were measured noninvasively from before induction of anaesthesia to 5 min after intubation (seven time points). RESULTS: Tracheal intubation was successful in all patients without requiring a neuromuscular blocking agent.
Intubating conditions were clinically acceptable in 10 of 20 patients (50%) in Gp I compared with 18 of 20 patients (90%) in Gp II (P < 0.05). MAP and HR decreased in both groups after induction of anaesthesia (P < 0.01). Both HR and MAP were significantly lower in Gp II compared with Gp I after tracheal intubation (P < 0.01). No patient in the present study developed bradycardia or hypotension. CONCLUSIONS: We conclude that remifentanil (3 microg.kg(-1)), administered before propofol (3 mg.kg(-1)), provides acceptable tracheal intubating conditions in children, and completely inhibited the increase in HR and MAP associated with intubation. Forty healthy children, aged 2–12 years, undergoing elective surgery where the anaesthetic technique involved tracheal intubation followed by spontaneous ventilation were studied. Induction of anaesthesia was with either alfentanil 15 micrograms.kg-1 or remifentanil 1 microgram.kg-1, followed by propofol 4 mg.kg-1 to which lignocaine 0.2 mg.kg-1 had been added. Intubating conditions were graded on a four-point scale for ease of laryngoscopy, vocal cord position, degree of coughing, jaw relaxation and limb movement. All children were successfully intubated at the first attempt. There were no significant differences in the assessments of intubating conditions between the two groups. Arterial blood pressure and heart rate changes were similar in the two groups, with both alfentanil and remifentanil attenuating the haemodynamic response to tracheal intubation. The time taken to resumption of spontaneous ventilation was similar in both groups. The aim of this study was to compare the effect of three different induction techniques, with or without neuromuscular block, on tracheal intubation, haemodynamic responses and cardiac rhythm.
Ninety children, aged 1–3 years, undergoing day-case adenoidectomy were randomly allocated to three groups: group TS received thiopentone 5 mg kg-1 and suxamethonium 1.5 mg kg-1, group H 5 vol% halothane, and group PA alfentanil 10 micrograms kg-1 and propofol 3 mg kg-1 for induction of anaesthesia. No anticholinergics were used. Holter monitoring of heart rate and rhythm was started at least 15 min before induction of anaesthesia and continued until 3 min after intubation. Tracheal intubation was performed by an anaesthetist blinded to the induction method and judged as excellent, moderate or poor according to ease of laryngoscopy, position of vocal cords and incidence of coughing after intubation. Tracheal intubation was successful at the first attempt in all children in groups TS and H, but in only 80% in group PA (P = 0.001). Intubating conditions were excellent in 22 (73%), 22 (73%) and one (3%) of the patients in groups TS, H and PA, respectively (P = 0.001). Cardiac dysrhythmias (supraventricular extrasystole or junctional rhythm) occurred in two (7%) patients in each of groups PA and H (NS). Bradycardia occurred in 0 (0%), four (14%) and six (21%) children in groups TS, H and PA, respectively (P = 0.007 PA vs. TS; P = 0.03 H vs. TS). In conclusion, induction of anaesthesia with propofol 3 mg kg-1 and alfentanil 10 micrograms kg-1 without neuromuscular block did not provide acceptable intubating conditions in children aged 1–3 years, although it preserved arterial pressure better than thiopentone/suxamethonium or halothane. Cardiac dysrhythmias were few regardless of the induction method. We determined the dose-response curves and effective doses of propofol for insertion of the laryngeal mask airway (LMA) in 50 unpremedicated children and in 60 children premedicated with midazolam, aged 3–12 yr. One of several doses of propofol was administered i.v.
over 15 s to groups of 10 children, and conditions for LMA insertion were assessed at 60 s. The dose-response curves were parallel (P = 0.94), but the curve for premedicated children was shifted significantly to the left of that for unpremedicated children, and propofol requirements were reduced by one-third (P < 0.0001). The doses required for satisfactory LMA insertion in 50% and 90% of unpremedicated patients (ED50, ED90) (95% confidence interval) were 3.8 (3.4–4.2) mg kg-1 and 5.4 (4.7–6.8) mg kg-1, respectively; those for premedicated patients were 2.6 (2.2–2.8) mg kg-1 and 3.6 (3.2–4.3) mg kg-1, respectively. The ability of alfentanil 15 μg kg-1 or 30 μg kg-1 to improve intubating conditions was studied in four groups of 25 ASA class 1 patients. Induction of anaesthesia was with thiopentone 5 mg kg-1. Neuromuscular blockade was induced with vecuronium using the priming principle. The priming dose, priming interval and intubating dose were 0.01 mg kg-1, 4 min, and 0.1 mg kg-1, respectively. Intubation was attempted 1 min after the intubating dose. Intubating conditions were judged unacceptable in about 30% of the patients belonging to the control groups. Alfentanil 15 μg kg-1, when administered 65 s before intubation, reduced the incidence of coughing and diaphragmatic movement (P < 0.05) but did not reduce the incidence of overall unacceptable intubating conditions. Alfentanil 30 μg kg-1, however, reduced the incidence of vocal cord movement (P < 0.005) as well as coughing and diaphragmatic movement (P < 0.002). Alfentanil 30 μg kg-1 reduced the incidence of unacceptable intubating conditions from about 30% to 4% (P < 0.02). BACKGROUND: Sevoflurane can be used as a sole agent for intubation in children, but studies have suggested that it is associated with emergence agitation. Fentanyl infusions can be used both to facilitate intubation and decrease emergence agitation.
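The ED50 and ED90 values in the LMA dose-finding study above are read off a fitted probit dose-response model. A minimal sketch of that calculation, assuming a probit fit on log-dose; the intercept and slope below are illustrative values chosen to roughly reproduce the unpremedicated ED50, not the study's fitted coefficients:

```python
from math import exp
from statistics import NormalDist

# Probit dose-response model: P(satisfactory insertion at dose d) = Phi(a + b*ln d)
# a, b are illustrative, NOT the study's fitted coefficients.
a, b = -6.0, 4.5

def effective_dose(p: float) -> float:
    """Dose (mg/kg) at which a fraction p of patients respond, inverted from the probit fit."""
    return exp((NormalDist().inv_cdf(p) - a) / b)

ed50 = effective_dose(0.50)  # ~3.8 mg/kg with these illustrative coefficients
ed90 = effective_dose(0.90)
```

With a probit model the ED50 falls where the linear predictor is zero, so parallel curves (equal slopes b), as reported for premedicated versus unpremedicated children, shift the whole dose scale by a constant multiplicative factor.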
We investigated the effects of fentanyl on conditions at intubation and on emergence from sevoflurane anaesthesia without confounding nitrous oxide or premedication. METHODS: IRB approval and informed consent were obtained. Subjects comprised 150 ASA physical status I or II children (age 2–6 yr). Anaesthesia was induced with sevoflurane in oxygen and maintained using a predetermined concentration of sevoflurane. Subjects were randomly allocated to receive one of three doses of fentanyl: vehicle only (control group), a bolus dose of 1 microg kg(-1) followed by a continuous infusion of 0.5 microg kg(-1) h(-1) (F1 group), or a bolus dose of 2 microg kg(-1) followed by a continuous infusion of 1 microg kg(-1) h(-1) (F2 group). Sevoflurane minimum alveolar concentration for tracheal intubation (MAC(TI)) and emergence agitation score were assessed. RESULTS: MAC(TI) values were 2.49%, 1.61%, and 1.16% in the control, F1, and F2 groups, respectively (P < 0.05). Agitation scores were 11.5, 7.0, and 2.6 in the control, F1, and F2 groups, respectively (P < 0.05). CONCLUSIONS: A fentanyl infusion consisting of a bolus dose of 2 microg kg(-1) followed by a continuous infusion of 1 microg kg(-1) h(-1) facilitates tracheal intubation and smooth emergence in children anaesthetized using sevoflurane. CLINICAL TRIAL REGISTRATION: This study was started in 2000 and finished in 2008. We had no registration number. IRB approval was obtained. BACKGROUND: To study the interaction between nitrous oxide and sevoflurane during tracheal intubation, the authors determined the minimum alveolar concentration of sevoflurane for tracheal intubation (MAC(TI)) with and without nitrous oxide in children.
METHODS: Seventy-two children aged 1–7 yr were randomly assigned to receive one of three end-tidal concentrations of nitrous oxide and one of four end-tidal concentrations of sevoflurane: 0% nitrous oxide with 2.0, 2.5, 3.0, or 3.5% sevoflurane; 33% nitrous oxide with 1.5, 2.0, 2.5, or 3.0% sevoflurane; or 66% nitrous oxide with 1.0, 1.5, 2.0, or 2.5% sevoflurane. After steady-state end-tidal anesthetic concentrations were maintained for at least 10 min, laryngoscopy and intubation were attempted using a straight-blade laryngoscope and an uncuffed tracheal tube. The interaction between nitrous oxide and sevoflurane was investigated using logistic regression analysis of the responses to intubation. RESULTS: Logistic regression curves of the probability of no movement in response to intubation in the presence of sevoflurane and 0, 33, and 66% nitrous oxide were parallel. The interaction coefficient between nitrous oxide and sevoflurane did not differ significantly from zero (P = 0.89) and was removed from the logistic model. The MAC(TI) (± SE) of sevoflurane was 2.66 ± 0.16%, and the concentration of sevoflurane required to prevent movement in 95% of children was 3.54 ± 0.25%. Thirty-three percent and 66% nitrous oxide decreased the MAC(TI) of sevoflurane by 18% and 40% (P < 0.001), respectively. CONCLUSIONS: We conclude that nitrous oxide and sevoflurane suppress the responses to tracheal intubation in a linear and additive fashion in children. In a double-blind study, intubating conditions and haemodynamic responses were assessed in two age groups of 45 ASA I–II children, with mean ages of 2.4 and 6.3 years, premedicated with oral midazolam and atropine.
The children were randomly allocated to one of three groups: alfentanil 20 μg·kg-1 + lidocaine 1 mg·kg-1 (Alf20 + Lign); alfentanil 20 μg·kg-1 (Alf20); or alfentanil 40 μg·kg-1 (Alf40), followed by propofol 3.5 mg·kg-1 in the children aged 1–3 years and 3.0 mg·kg-1 in the older children. Intubating conditions, 40 s after the administration of propofol, were assessed as good, moderate or impossible on the basis of jaw relaxation, ease of insertion of the endotracheal tube and coughing during intubation. In the younger age group the frequencies of good, moderate or impossible intubating conditions were 87, 13 and 0% in the Alf40 group; 40, 60 and 0% in the Alf20 group (P < 0.05 compared to the Alf40 group); and 53, 47 and 0% in the Alf20 + Lign group. In the older age group the corresponding frequencies were 60, 33 and 7% in the Alf20 + Lign group; 47, 53 and 0% in the Alf20 group; and 47, 40 and 13% in the Alf40 group. All the drugs prevented any increase in arterial pressure and heart rate after tracheal intubation. The QTc interval of the ECG was always in the normal range. Clinically important bradycardia did not occur. In conclusion, the best intubating conditions occurred after propofol 3.5 mg·kg-1 and alfentanil 40 μg·kg-1 in the younger age group. In the other children, good or moderate intubating conditions occurred in 87–100% after all the drugs used in the present study. We have assessed tracheal intubating conditions in 60 ASA I or II children, aged 3–12 yr, after induction of anaesthesia with alfentanil 5, 10 or 15 micrograms kg-1, followed by an induction dose of propofol. Neuromuscular blocking agents were not given. Three aspects of intubating conditions were assessed on a four-point scale: ease of laryngoscopy, vocal cord position and degree of coughing on insertion of the tracheal tube.
The number of patients in whom each component of the assessment was satisfactory increased significantly as the dose of alfentanil increased (ease of laryngoscopy P = 0.003; vocal cord position P = 0.0004; degree of coughing P = 0.018). Intubation was successful in 70%, 95% and 95% of patients after alfentanil 5, 10 or 15 micrograms kg-1, respectively, and conditions were considered to be excellent in 20%, 70% and 80% of patients, respectively. Side effects included pain on injection of propofol (27%), excitatory movements (5%) and bradycardia (1.7%). BACKGROUND: Tracheal intubation during sevoflurane induction is frequently facilitated with i.v. propofol. We designed a dose-response study to evaluate the intubating conditions, and the incidence and duration of apnea, after i.v. propofol in children. METHODS/MATERIALS: Sixty healthy children were randomly assigned to 0, 0.5, 1, 2 or 3 mg x kg(-1) i.v. propofol during sevoflurane/nitrous oxide anesthesia. Tracheal intubation was performed approximately 30 s after propofol by an anesthesiologist who was blind to the treatment. The anesthesiologist assessed the responses to laryngoscopy and intubation using a standardized scale. Incidence and duration of apnea after propofol, as well as heart rate and systolic blood pressure before and after laryngoscopy, were recorded. Data were analyzed using one-way and repeated-measures ANOVA, the Jonckheere-Terpstra test, and logistic regression, with P < 0.05 accepted. RESULTS: The laryngoscopy score after 3 mg x kg(-1) propofol was less than that after 0 mg x kg(-1) (P < 0.01) and 0.5 mg x kg(-1) (P < 0.05). The incidence of apnea after propofol 3 mg x kg(-1), 8/10, was greater than after 0 mg x kg(-1), 3/14 (P < 0.011), and 0.5 mg x kg(-1), 3/12 (P < 0.03). The duration of apnea after 3 mg x kg(-1) was greater than after 0 and 0.5 mg x kg(-1) (P < 0.01).
The risk of apnea increased 1.83-fold for each 1 mg x kg(-1) increase in propofol dose (P < 0.01). Mean heart rate and systolic pressure decreased with the main effect, time. CONCLUSION: During sevoflurane/nitrous oxide anesthesia, propofol 3 mg x kg(-1) provides superior intubating conditions, with an increased incidence of and prolonged apnea, compared with 0 and 0.5 mg x kg(-1). Background: Vocal cord sequelae and postoperative hoarseness during general anesthesia are a significant source of morbidity for patients and a source of liability for anesthesiologists. Several risk factors leading to laryngeal injury have been identified in the past. However, whether the quality of tracheal intubation affects their incidence or severity is still unclear. Methods: Eighty patients were randomized into two groups (n = 40 each) to receive a propofol–fentanyl induction regimen with or without atracurium. Intubation conditions were evaluated with the Copenhagen Score; postoperative hoarseness was assessed at 24, 48, and 72 h by a standardized interview; and vocal cords were examined by stroboscopy before and 24 and 72 h after surgery. If postoperative hoarseness or vocal cord sequelae persisted, follow-up examination was performed until complete restitution. Results: Without atracurium, postoperative hoarseness occurred more often (16 vs. 6 patients; P = 0.02). The number of days with postoperative hoarseness was higher when atracurium was omitted (25 vs. 6; P < 0.001). Similar findings were observed for vocal cord sequelae (incidence of vocal cord sequelae: 15 vs. 3 patients, respectively, P = 0.002; days with vocal cord sequelae: 50 vs. 5, respectively, P < 0.001). Excellent intubating conditions were less frequently associated with postoperative hoarseness compared to good or poor conditions (11, 29, and 57% of patients, respectively; excellent vs. poor: P = 0.008).
Similar findings were observed for vocal cord sequelae (11, 22, and 50% of patients, respectively; excellent vs. poor: P = 0.02). Conclusions: The quality of tracheal intubation contributes to laryngeal morbidity, and excellent conditions are less frequently associated with postoperative hoarseness and vocal cord sequelae. Adding atracurium to a propofol–fentanyl induction regimen significantly improved the quality of tracheal intubation and decreased postoperative hoarseness and vocal cord sequelae. In a blinded randomized study, intubating conditions were compared at one min following intravenous induction with propofol and either suxamethonium 1.0 mg.kg-1 or rocuronium 0.6 mg.kg-1. Onset time to maximal twitch depression, % block at one minute and clinical duration (time to 25% recovery) were measured. Sixty children undergoing elective tonsillectomy were recruited. Onset time [42 s (SD 11 s)] and clinical duration [3.3 min (SD 1.0 min)] in the suxamethonium group were significantly (P < 0.001) less than in the rocuronium group [92 s (41 s)] and [24.2 min (6.6 min)], respectively. The median twitch height at one minute for suxamethonium was 0% (range 0–8%), and was significantly greater (P < 0.001) at 5% (range 0–22%) for rocuronium. Despite this, there was no difference in the intubating conditions at one minute, with 25 excellent/5 good in the suxamethonium group and 27 excellent/3 good in the rocuronium group. We conclude that rocuronium 0.6 mg.kg-1 gives optimal intubating conditions at one minute in children. Better definition of the end points required to achieve successful tracheal intubation after induction with sevoflurane could improve patient care. The authors therefore designed a study that could determine, with meaningful confidence intervals, the time required to successfully intubate 80% of children using 8% inspired sevoflurane and no muscle relaxant.
We hypothesized that the time required could vary by age or body mass index. One hundred fifty-three ASA physical status I or II patients received induction with 8% sevoflurane in 60% nitrous oxide, with discontinuation of nitrous oxide 1 min after the start of the induction. The time until laryngoscopy was kept close to the time required to achieve 80% successful intubation by varying the induction time according to the success rate in each group of five patients. A probit model of induction time and age found that both were predictive of successful intubation (P values of 0.006 and 0.02, respectively). The induction times needed to achieve 80% successful intubation were 137 s (95% confidence interval, 94.6–159 s) and 187 s (153–230 s) for ages 1–4 yr and 4–8 yr, respectively. The persistence of spontaneous ventilation at the time of laryngoscopy, despite attempts to control ventilation, was associated with poor intubation conditions (P < 0.001). The optimal dose of remifentanil needed to produce successful intubating conditions following inhalation induction of anaesthesia using 5% sevoflurane, without the use of neuromuscular blocking drugs, was investigated in 25 children aged 3–10 years. Sixty seconds after inhalation induction of anaesthesia using sevoflurane 5% in 100% oxygen, a predetermined dose of remifentanil was injected over 30 s. The dose of remifentanil was determined using the modified Dixon's up-and-down method (0.2 μg.kg−1 as the step size). The first patient was tested at 1.0 μg.kg−1 remifentanil. Ninety seconds following the bolus administration of remifentanil, the child's trachea was intubated. The optimal bolus dose of remifentanil required for successful tracheal intubation in 50% of children was 0.56 (0.15) μg.kg−1 during inhalation induction using 5% sevoflurane in the absence of neuromuscular blocking drugs.
Using probit analysis, the 95% effective dose (ED95) of remifentanil was 0.75 μg.kg−1 (95% confidence limits 0.63–1.38 μg.kg−1).
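The dose-finding design above is Dixon's up-and-down method: each child's dose is stepped down after a successful intubation and up after a failure, and the ED50 is estimated from the midpoints of success/failure crossover pairs. A simplified sketch of that procedure; only the 1.0 μg.kg−1 starting dose and 0.2 μg.kg−1 step come from the abstract, while the response sequence is invented for illustration:

```python
# Dixon's up-and-down method: dose steps down after a success, up after a failure.
STEP = 0.2  # ug/kg step size, as in the abstract

def up_down_doses(start: float, responses: list[bool]) -> list[float]:
    """Dose given to each successive patient; responses[i] is the outcome at doses[i]."""
    doses = [start]
    for success in responses:
        doses.append(doses[-1] - STEP if success else doses[-1] + STEP)
    return doses

def ed50_from_crossovers(doses: list[float], responses: list[bool]) -> float:
    """Mean midpoint of each consecutive pair of doses where the outcome flipped."""
    mids = [
        (doses[i] + doses[i + 1]) / 2
        for i in range(len(responses) - 1)
        if responses[i] != responses[i + 1]
    ]
    return sum(mids) / len(mids)

# Invented response sequence (True = successful intubation), for illustration only.
responses = [True, True, False, True, False, False, True, True, False, True]
doses = up_down_doses(1.0, responses)
ed50 = ed50_from_crossovers(doses, responses)
```

The ED95 reported in the abstract cannot be read off the staircase directly; it comes from a probit fit to the same dose-response data, which extrapolates into the upper tail of the fitted curve.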
Comprehensive cardiac rehabilitation has positive effects on many cardiac risk factors (physical activity, smoking status, cholesterol, anxiety and depression) and can lead to improvements in mortality, morbidity and quality of life. While there have been fewer studies of home-based cardiac rehabilitation, the available data suggest that it has comparable results to hospital-based programs. Many of these studies are small and heterogeneous in terms of interventions, but home-based cardiac rehabilitation appears both safe and effective. Available evidence suggests that it results in longer-lasting maintenance of physical activity levels compared with hospital-based rehabilitation and is equally effective in improving cardiac risk factors. Furthermore, it has the potential to be a more cost-effective intervention for patients who cannot easily access their local centre or hospital. Currently home-based cardiac rehabilitation is not offered routinely to all patients, but it appears to have the potential to increase uptake in patients who are unable, or less likely, to attend more traditional hospital-based cardiac rehabilitation programs.
Most formal cardiac rehabilitation in the UK is offered within a hospital or centre setting, although this may not always be convenient or accessible for many cardiac patients, especially those in remote areas. The proportion of eligible patients who successfully complete a cardiac rehabilitation program remains low. There are many reasons for this, but geographical isolation and transport issues are important. This systematic review examines the current evidence for home- versus hospital-based cardiac rehabilitation. Home-based cardiac rehabilitation offers greater accessibility to cardiac rehabilitation and has the potential to increase uptake.
Background: Whether cardiac rehabilitation (CR) is effective in patients older than 75 years, who have been excluded from most trials, remains unclear. We enrolled patients 46 to 86 years old in a randomized trial and assessed the effects of 2 months of post-myocardial infarction (MI) CR on total work capacity (TWC, in kilograms per meter) and health-related quality of life (HRQL). Methods and Results: Of 773 screened patients, 270 without cardiac failure, dementia, disability, or contraindications to exercise were randomized to outpatient, hospital-based CR (Hosp-CR), home-based CR (Home-CR), or no CR within 3 predefined age groups (middle-aged, 45 to 65 years; old, 66 to 75 years; and very old, > 75 years) of 90 patients each. TWC and HRQL were determined with cycle ergometry and the Sickness Impact Profile at baseline, after CR, and 6 and 12 months later. Within each age group, TWC improved with Hosp-CR and Home-CR and was unchanged with no CR. The improvement was similar in middle-aged and old persons but smaller, although still significant, in very old patients. TWC reverted toward baseline by 12 months with Hosp-CR but not with Home-CR. HRQL improved in middle-aged and old CR and control patients, but only with CR in very old patients. Complications were similar across treatment and age groups. Costs were lower for Home-CR than for Hosp-CR. Conclusions: Post-MI Hosp-CR and Home-CR are similarly effective in the short term and improve TWC and HRQL in each age group. However, with lower costs and more prolonged positive effects, Home-CR may be the treatment of choice in low-risk older patients. BACKGROUND: Increases in life stress have been linked to poor prognosis after myocardial infarction (MI).
Previous research suggested that a programme of monthly screening for psychological distress, combined with supportive and educational home nursing interventions for distressed patients, may improve post-MI survival among men. Our study assessed this approach for both men and women. We aimed to find out whether the programme would reduce 1-year cardiac mortality for women and men. METHODS: We carried out a randomised, controlled trial of 1376 post-MI patients (903 men, 473 women) assigned to the intervention programme (n = 692) or usual care (n = 684) for 1 year. All patients completed a baseline interview that included assessment of depression and anxiety. Survivors were also interviewed at 1 year. FINDINGS: The programme had no overall survival impact. Preplanned analyses showed higher cardiac (9.4 vs 5.0%, p = 0.064) and all-cause mortality (10.3 vs 5.4%, p = 0.051) among women in the intervention group. There was no evidence of either benefit or harm among men (cardiac mortality 2.4 vs 2.5%, p = 0.94; all-cause mortality 3.1 vs 3.1%, p = 0.93). The programme's impact on depression and anxiety among survivors was small. INTERPRETATION: Our results do not warrant the routine implementation of programmes that involve psychological-distress screening and home nursing intervention for patients recovering from MI. The poorer overall outcome for women, and the possible harmful impact of the intervention on women, underline the need for further research and the inclusion of adequate numbers of women in future post-MI trials. BACKGROUND: Participation in cardiac rehabilitation after acute myocardial infarction is suboptimal. Offering home-based rehabilitation may improve uptake. We report the first randomized study of cardiac rehabilitation to include patient preference.
AIM: To compare the clinical effectiveness of home-based rehabilitation with hospital-based rehabilitation after myocardial infarction, and to determine whether patient choice affects clinical outcomes. DESIGN: Pragmatic randomized controlled trial with patient preference arms. SETTING: Rural South West England. METHODS: Patients admitted with uncomplicated myocardial infarction were offered hospital-based rehabilitation classes over 8–10 weeks or a self-help package of six weeks' duration (the Heart Manual) supported by a nurse. Primary outcomes at 9 months were mean depression and anxiety scores on the Hospital Anxiety and Depression scale, quality of life after myocardial infarction (MacNew) score, and serum total cholesterol. RESULTS: Of the 230 patients who agreed to participate, 104 (45%) consented to randomization and 126 (55%) chose their rehabilitation programme. Nine-month follow-up data were available for 84/104 (81%) randomized and 100/126 (79%) preference patients. At follow-up no difference was seen in the change in mean depression scores between the randomized home- and hospital-based groups (mean difference: 0; 95% confidence interval, -1.12 to 1.12), nor in mean anxiety score (-0.07; -1.42 to 1.28), mean global MacNew score (0.14; -0.35 to 0.62) or mean total cholesterol levels (-0.18; -0.62 to 0.27). Neither were there any significant differences in outcomes between the preference groups. CONCLUSIONS: Home-based cardiac rehabilitation with the Heart Manual was as effective as hospital-based rehabilitation for patients after myocardial infarction. Choosing a rehabilitation programme did not significantly affect clinical outcomes. BACKGROUND: Disease management programs (DMPs) that use multidisciplinary teams and specialized clinics reduce hospital admissions and improve quality of life and functional status. Evaluations of cardiac DMPs delivered by home health nurses are required.
METHODS Between August 1999 and August 2000 we identified consecutive patients admitted to hospital with elevated cardiac enzymes . Patients who agreed were randomly assigned to participate in a DMP or to receive usual care . The DMP included 6 home visits by a cardiac-trained nurse , a standardized nurses ' checklist , referral criteria for specialty care , communication with the family physician and patient education . We measured readmission days per 1000 follow-up days for angina , congestive heart failure ( CHF ) and chronic obstructive pulmonary disease ( COPD ) ; all-cause readmission days ; and provincial claims for emergency department visits , physician visits , diagnostic or therapeutic services and laboratory services . RESULTS We screened 715 consecutive patients admitted with elevated cardiac markers between August 1999 and August 2000 . Of those screened 71 DMP and 75 usual care patients met the diagnostic criteria for myocardial infarction , were eligible for visits from a home health nurse and consented to participate in the study . Readmission days for angina , CHF and COPD per 1000 follow-up days were significantly higher for usual care patients than for DMP patients ( incidence density ratio [ IDR ] = 1.59 , 95 % confidence interval [ CI ] 1.27 - 2.00 , p < 0.001 ) . All-cause readmission days per 1000 follow-up days were significantly higher for usual care patients than for DMP patients ( IDR = 1.53 , 95 % CI 1.37 - 1.71 , p < 0.001 ) . The difference in emergency department encounters per 1000 follow-up days was significant ( IDR = 2.08 , 95 % CI 1.56 - 2.77 , p < 0.001 ) . During the first 25 days after discharge , there were significantly fewer provincial claims submitted for DMP patients than for usual care patients for emergency department visits ( p = 0.007 ) , diagnostic or therapeutic services ( p = 0.012 ) and laboratory services ( p = 0.007 ) . 
INTERPRETATION The results provide evidence that an appropriately developed and implemented community-based inner-city DMP delivered by home health nurses has a positive impact on patient outcomes Background : Psychological morbidity after an acute myocardial infarction ( AMI ) is known to be common , but can be addressed by appropriate rehabilitation . The area in which this research was conducted experiences high rates of deprivation and of coronary heart disease and limited access to hospital-based rehabilitation . Responding to concern about psychological needs of AMI patients , a self-help package was introduced and evaluated alongside standard hospital-based cardiac rehabilitation . Aims : To evaluate the impact of a home-based self-help package ( the Heart Manual ) , alongside existing cardiac rehabilitation provision , on psychological morbidity and health status after AMI . A secondary aim was to assess the suitability of the Heart Manual for older patients aged over 80 years . Methods : A controlled observational study , comparing two cohorts of patients discharged from hospital after AMI . The intervention group was given the self-help package in addition to standard care . The control group received standard care alone . Outcome measures used were the Hospital Anxiety and Depression Scale and the EuroQol . Results : The intervention group showed significant improvement in anxiety and depression scores after 3 months and nonsignificant improvement in general health status . Patients who attended hospital-based rehabilitation classes , and those aged over 80 years , also benefited from the intervention . 
Conclusion : A home-based self-help rehabilitation package is an effective tool alongside hospital-based rehabilitation classes and can be given to all age groups This prospective study evaluated the effect of an individualized , comprehensive , home-based cardiac rehabilitation program combining exercise training with risk factor modification and psychosocial counseling on risk factors , psychological well-being , functional capacity , and work resumption in 99 post-percutaneous coronary interventions ( PCI ) patients randomized to control ( standard care plus telephone follow-up , n=49 ) or intervention ( individualized , comprehensive , home-based cardiac rehabilitation , n=50 ) groups . Data were collected at time 1 ( T(1 ) ) during hospital admission , time 2 ( T(2 ) ) approximately 2 months post-PCI , and time 3 ( T(3 ) ) approximately 12 months post-PCI . Results suggest that the allocation to an individualized , comprehensive , home-based cardiac rehabilitation program provided more advantageous outcomes . At both follow-ups , the intervention group showed within-group improvement in serum cholesterol levels ( P<0.02 ; P<0.01 ) and exercise participation ( P<0.001 ; P<0.001 ) with differences in exercise participation favoring the intervention group ( P<0.01 ) at T(2 ) . Repeated measures ANOVA showed significant improvements over time in body mass index ( BMI ) ( P<0.01 ) , psychological well-being ( P<0.001 ) , and functional capacity ( P<0.001 ) for both groups . More patients in the intervention group had returned to work at T(2 ) ( P<0.001 ) and did so more quickly ( P<0.01 ) . These findings suggest that an individualized , comprehensive , home-based cardiac rehabilitation program improves risk factor profiles and work resumption patterns for patients following PCI BACKGROUND Previous reports indicate risk factors and lifestyle behaviors may deteriorate early after completion of a cardiac rehabilitation program ( CRP ) . 
We hypothesized that a modest risk factor and lifestyle management intervention after a CRP would significantly reduce overall cardiovascular risk using the Framingham risk score compared with usual care after 4 years . METHODS Patients with ischemic heart disease ( n = 302 ) were randomized after a CRP to either usual care or intervention ( exercise sessions , telephone follow-ups , counseling sessions , and reports to the participants ' family physicians ) . The Framingham risk score , risk factors , and lifestyle behaviors were compared after 4 years . RESULTS Data were available for 130 intervention and 119 usual care participants . The intervention resulted in 15.5 hours of direct participant contact . Framingham score , total cholesterol , low-density lipoprotein cholesterol , and systolic blood pressure were significantly improved in the intervention group after adjusting for baseline factors . There were no significant differences with respect to lifestyle factors between the groups . CONCLUSIONS A modest risk factor and lifestyle management intervention resulted in a significant reduction to global risk compared with usual care and should be considered after CRP Background Home-based cardiac rehabilitation ( CR ) has been demonstrated to be as effective as institution-based CR in post-coronary artery bypass graft surgery ( CABG ) patients in terms of short-term physical and psychosocial outcomes . The sustainability of these effects is less well studied . The aim of this study was to examine the sustainability of observed changes in physical , quality of life ( HRQL ) , and social support ( SS ) outcomes in patients 12 months after discharge from a randomized controlled trial ( RCT ) of 6 months of monitored home-based versus supervised hospital-based CR . Design Two-hundred and twenty-two ( n = 222 ) patients were followed-up 12 months after discharge from a RCT of 6 months of monitored ‘ Home ' versus supervised ‘ Hospital ' CR after CABG . 
Methods At discharge from the 6-month RCT , participants who consented to the 12-month follow-up study , were given individualized guidelines for ongoing exercise , and were not contacted for 1 year . The primary outcome was peak oxygen uptake ( VO2 ) . Secondary outcomes were : HRQL , SS and habitual physical activity . Results One hundred and ninety-eight patients ( 89.2 % ) , 102 ‘ Hospital ' and 96 ‘ Home ' , returned for follow-up 12-months after discharge from CR . Both groups had similar medical and socio-demographic characteristics . Peak VO2 declined in ‘ Hospital ' but was sustained in ‘ Home ' patients 12 months after discharge from CR ( P=0.002 ) . Physical HRQL was higher in the ‘ Home ' group at the 12-month follow-up ( P<0.01 ) . Mental HRQL showed general , minor deterioration over time in both groups ( P=0.019 ) . Twelve months after discharge from CR , physical and mental HRQL remained higher than at entry to CR in both groups . ‘ Home ' patients had higher habitual physical activity scores compared to ‘ Hospital ' patients . Conclusions This follow-up study suggests that low-risk patients whose CR is initiated in the home environment may be more likely to sustain positive physical and psychosocial changes over time than patients whose program is initially institution-based BACKGROUND Diastolic heart failure ( DHF ) is common in older women . There have been no clinical trials that have identified therapies to improve symptoms in these patients . A total of 32 women with New York Heart Association class II and III DHF ( left ventricular ejection fraction > 45 % and symptoms of dyspnea or fatigue ) were randomized into a 12-week home-based , low-to-moderate intensity ( 40 % and 60 % , respectively ) exercise and education program ( intervention ) or education only program ( control ) . 
Methods and results The intervention group improved in the 6-minute walk test from 840 +/- 366 ft to 1043 +/- 317 ft versus 824 +/- 367 ft to 732 +/- 408 ft in the control group ( P = .002 ) . Quality of life also improved in the intervention group compared with the control group as measured by the Living with Heart Failure Questionnaire ( 41 +/- 26 to 24 +/- 18 vs 27 +/- 18 to 28 +/- 22 at 12 weeks , P = .002 ; 24 +/- 18 to 19 +/- 18 vs 28 +/- 22 to 32 +/- 27 at the 3-month follow-up , P = .014 ) and the Geriatric Depression Scale ( 6 +/- 4 to 4 +/- 4 vs 5 +/- 3 to 7 +/- 5 at 12 weeks , P = .012 ; 4 +/- 4 to 4 +/- 4 vs 7 +/- 5 to 7 +/- 5 at the 3-month follow-up , P = .009 ) . CONCLUSIONS Women with DHF exhibit significant comorbidities and physical limitations . Home-based , low-to-moderate intensity exercise , in addition to education , is an effective strategy for improving the functional capacity and quality of life in women with DHF . Further study is needed to assess the long-term effect of exercise on clinical outcomes BACKGROUND Home-based cardiac rehabilitation offers an alternative to traditional , hospital-based cardiac rehabilitation . AIM To compare the cost effectiveness of home-based cardiac rehabilitation and hospital-based cardiac rehabilitation . METHODS 104 patients with an uncomplicated acute myocardial infarction and without major comorbidity were randomized to receive home-based rehabilitation ( n=60 ) i.e. nurse facilitated , self-help package of 6 weeks ' duration ( the Heart Manual ) or hospital-based rehabilitation for 8 - 10 weeks ( n=44 ) . Complete economic data were available in 80 patients ( 48 who received home-based rehabilitation and 32 who received hospital-based rehabilitation ) . Healthcare costs , patient costs , and quality of life ( EQ-5D4.13 ) were assessed over the 9 months of the study . 
RESULTS The cost of running the home-based rehabilitation programme was slightly lower than that of the hospital-based programme ( mean ( 95 % confidence interval ) difference - 30 pounds sterling ( - 45 pounds sterling to - 12 pounds sterling ) [ -44 euro , -67 euro to -18 euro ] per patient . The cost difference was largely the result of reduced personnel costs . Over the 9 months of the study , no significant difference was seen between the two groups in overall healthcare costs ( 78 pounds sterling , - 1102 pounds sterling to 1191 pounds sterling [ -115 euro , -1631 euro to -1763 euro ] per patient ) or quality adjusted life-years ( -0.06 ( -0.15 to 0.02 ) ) . The lack of significant difference between home-based rehabilitation and hospital-based rehabilitation did not alter when different costs and different methods of analysis were used . CONCLUSIONS The health gain and total healthcare costs of the present hospital-based and home-based cardiac rehabilitation programmes for patients after myocardial infarction appear to be similar . These initial results require affirmation by further economic evaluations of cardiac rehabilitation in different settings PURPOSE Despite the documented benefits of participating in rehabilitation programs , access to cardiac rehabilitation is limited for a large number of people with coronary artery disease ( CAD ) . There is potential to increase participation in exercise training if home-based exercise were a viable option . METHODS We conducted a retrospective database review of 1,042 patients who took part in exercise rehabilitation following coronary artery bypass graft surgery ( CABGS ) between 1992 and 1998 . Of these , 713 patients took part in supervised exercise , and 329 were in an unsupervised , home-based group . 
All exercise protocols were based upon American College of Sports Medicine guidelines , and patients in both groups received exercise prescriptions that were similar in intensity , frequency , and duration . RESULTS There were no differences between groups at baseline . Following 6 months of exercise training , there were substantial improvements in peak VO2 , peak workload , and peak MET levels in both the supervised and unsupervised groups ( P < 0.0001 ) . Patients in the supervised group had significant improvements in both LDL and HDL-cholesterol , whereas the home-based group showed improvement in HDL-cholesterol only . When analyzed by sex , men performed better than women for all measures of exercise capacity ; however , women in both groups showed approximate 20 % improvements ( P < 0.05 ) in exercise capacity as well as improvements in HDL-cholesterol . CONCLUSION Stable post CABGS patients who receive a detailed exercise prescription to follow at home do as well as those in supervised rehabilitation BACKGROUND Hospital and exercise-based cardiac rehabilitation programmes do not suit many older patients and home-based rehabilitation may be more effective . OBJECTIVE To evaluate a home-based intervention for patients aged 65 years or over discharged home from hospital after emergency admission for suspected myocardial infarction . DESIGN A single-blind randomised controlled trial comparing home-based intervention by a nurse with usual care . SUBJECTS Patients aged 65 years or over discharged home after hospitalisation with suspected myocardial infarction ( n= 324 ) . INTERVENTION Home-based intervention ( n = 163 ) consisted of home visits at 1 - 2 and 6 - 8 weeks after hospital discharge by a nurse who encouraged compliance with and knowledge of their treatment regimen , offered support and guidance about resuming daily activities , and involved other community services as appropriate . 
MEASUREMENTS Up to 100 days after admission , data were collected on deaths , hospital readmissions and use of outpatient services . Survivors were sent a postal questionnaire to assess activities of daily living and quality of life . RESULTS At 100 day follow-up there was no difference in deaths , activities of daily living or overall quality of life , but those in the intervention group scored significantly better on the confidence and self-esteem subsections . The intervention group had fewer hospital readmissions ( 35 versus 51 , relative risk 0.68 , 95 % CI 0.47 - 0.98 , P < 0.05 ) and fewer days of hospitalisation after initial discharge ( mean difference -1.7 , 95 % CI -2.09 to -1.31 , P < 0.05 ) . A total of 42/43 individuals in the intervention group had resumed driving at follow-up , compared with 32/43 in the usual care group ( observed difference between proportions 23 % , 95 % CI 9 - 37 % , P < 0.05 ) . CONCLUSION Amongst older patients discharged home after hospitalisation for suspected myocardial infarction , home-based nurse intervention may improve confidence and self-esteem , and reduce early hospital readmissions BACKGROUND There are approximately 1.8 million patients with angina in the United Kingdom , many of whom report a poor quality of life , including raised levels of anxiety and depression . AIM To evaluate the effect of a cognitive behavioural disease management programme , the Angina Plan , on psychological adjustment in patients newly diagnosed with angina pectoris . DESIGN OF STUDY Randomised controlled trial . SETTING Patients from GP practices in a Northern UK city ( York ) between April 1999 and May 2000 . METHOD Recruited patients were randomised to receive the Angina Plan or to a routine , practice nurse-led secondary prevention educational session . 
RESULTS Twenty of the 25 practices invited to join the study supplied patients ' names ; 142 patients attended an assessment clinic and were randomised . There were no significant differences in any baseline measures . At the six month post-treatment follow-up , 130 ( 91 % ) patients were reassessed . When compared with the educational session patients ( using analysis of covariance adjusted for baseline scores in an intention-to-treat analysis ) Angina Plan patients showed a greater reduction in anxiety ( P = 0.05 ) and depression ( P = 0.01 ) , the frequency of angina ( reduced by three episodes per week , versus a reduction of 0.4 per week , P = 0.016 ) the use of glyceryl trinitrate ( reduced by 4.19 fewer doses per week versus a reduction of 0.59 per week , P = 0.018 ) , and physical limitations ( P<0.001 : Seattle Angina Questionnaire ) . They were also more likely to report having changed their diet ( 41 versus 21 , P<0.001 ) and increased their daily walking ( 30 versus 2 , P<0.001 ) . There was no significant difference between the groups on the other sub-scales of the Seattle Angina Questionnaire or in any of the medical variables measured . CONCLUSION The Angina Plan appears to improve the psychological , symptomatic , and functional status of patients newly diagnosed with angina Common concerns with the traditional protocol ( TP ) for cardiac rehabilitation include suboptimal program participation , poor facilitation of independent exercise , the use of costly continuous electrocardiographic ( ECG ) monitoring , and lack of insurance reimbursement . To address these concerns , a reduced cost-modified protocol ( MP ) was developed to promote independent exercise . Eighty low- to moderate-risk cardiac patients were randomized to a TP ( n = 42 ) or a MP ( n = 38 ) and were compared over 6 months on program participation , exercise adherence , cardiovascular outcomes , and program costs . 
During month 1 , patients followed identical regimens , including 3 ECG-monitored exercise sessions/week , with encouragement to achieve >/=5 thirty-minute sessions/week . In week 5 , the TP continued with a facility-based regimen including 3 exercise sessions/week for 6 months and used ECG monitoring the initial 3 months . The MP discontinued ECG monitoring in week 5 and were gradually weaned to an off-site exercise regimen that was complemented with educational support meetings and telephone follow-up . Compared with TP patients , MP patients had higher rates of off-site exercise over 6 months ( p = 0.05 ) , and total exercise ( on site + off site ) during the final 3 months ( p = 0.03 ) . Also , MP patients were less likely to drop out ( p = 0.05 ) . Both protocols promoted comparable improvements in maximal oxygen uptake ( p < 0.05 ) , blood lipids ( p < 0.001 ) , and hemodynamic measurements ( p < 0.002 ) . The MP cost $ 738 less/patient than the TP and required 30 % less staff ( full-time equivalents ) . These results suggest that a reduced cost MP was as effective as an established TP in improving physiologic outcomes while demonstrating higher rates of exercise adherence and program participation . Thus , the MP or a similar protocol has applicability to hospitals with large capitated or managed care populations to provide cost-effective cardiovascular risk reduction to patients Two hundred patients who had suffered an acute myocardial infarction 4 - 6 weeks before entered a randomised controlled trial of exercise treatment at a community sports centre supervised by a general practitioner . Eighty one per cent of the treatment group continued to exercise until they returned to work and 73 % completed three months ' exercise . There were no serious complications of the exercise course . The prevalence of angina pectoris fell by 10 % in the treatment group but rose by 60 % in the control group . 
The perceived energy level rose by significantly more in the treatment group than in the controls . The rise in predicted maximum oxygen uptake was significantly greater in the treatment group than in the control group as was the reduction in the double product ( a reflection of myocardial workload ) at peak exercise . Coronary rehabilitation in the community can be both safe and effective The Ontario Exercise-Heart Collaborative Study was a multicenter randomized clinical trial of high intensity exercise for the prevention of recurrent myocardial infarction in 733 men . Of the 678 subjects who could have participated for at least 3 years , 315 ( 46.5 % ) dropped out . Stepwise multiple linear logistic regression analysis was carried out to examine the relation between subject characteristics and the probability of dropping out during the study . Analysis was performed on the entry group as a whole by considering those subjects who had reinfarction while complying with the program and also by excluding all subjects with reinfarctions . The consistent and statistically significant predictors of dropout in both analyses were smoking and a blue collar occupation . Angina was significantly associated with dropout only when reinfarctions were excluded . It may be important to consider these factors when investigating the potential for compliance-improving strategies in reducing dropout from exercise rehabilitation programs PURPOSE Anxiety and depressive disorders have been established as independent risk factors for the development of and recovery from coronary heart disease ( CHD ) . However , few studies have reported on the prevalence and personal characteristics of comorbid psychiatric disorders ( PD ) among cardiac populations . 
This project examined the prevalence of comorbid depressive and anxiety disorders among men and women with CHD commencing cardiac rehabilitation ( CR ) and the demographic , medical , and psychosocial characteristics among those meeting multiple PD criteria . METHODS Participants were 143 CHD patients ( M age , 61 years ; SD , 11.2 ; 70 % men , 91 % Caucasian , 64 % married ) entering CR who were evaluated via a semistructured , psychiatric interview to assess both current and lifetime prevalence rates of PD . Demographic , medical , and psychosocial variables were also assessed . RESULTS Approximately 45 % met criteria for at least 1 anxiety disorder , and 20 % met criteria for either major depressive disorder or dysthymic disorder either at the time of evaluation or in their lifetime . Across all participants , 26 % met criteria for ≥2 PD . Of those with a depressive disorder , 76 % also met criteria for at least 1 anxiety disorder . Participants with comorbid PD were of younger age and female and reported less education ( P < .01 ) . Comorbidity was also associated with self-reported overall diminished physical , emotional , and social quality of life , depression , and anxiety . CONCLUSION Comorbid PD are highly prevalent in the CR setting and are associated with specific demographic characteristics and reduced quality of life . These data offer additional support that routine screening for PD is warranted in outpatient cardiac settings .
11,007
27,661,111
The effect of low TSR on poor OS was observed among various cancer types , but not in the early stage of cervical cancer . Subgroup analyses indicated that cancer type , clinical stage , study region , blinding status , and NOS score did not affect the prognostic value of TSR for DFS . Moreover , low TSR was significantly correlated with the serious clinical stage , advanced depth of invasion , and positive lymph node metastasis . These findings indicate that a high proportion of stroma in cancer tissue is associated with poor clinical outcomes in cancer patients , and TSR may serve as an independent prognostic factor for solid tumors
Tumor-related stroma plays an active role in tumor invasion and metastasis . The tumor – stroma ratio ( TSR ) in the pathologic specimen has drawn increasing attention from the field of predicting tumor prognosis . However , the prognostic value of TSR in solid tumors necessitates further elucidation . We conducted a meta- analysis on 14 studies with 4238 patients through a comprehensive electronic search on databases updated in May 2016 to explore the relationship between TSR and prognosis of solid tumors .
Background The study aimed to determine the prognostic impact of clinical and pathological factors on survival among patients with small cell neuroendocrine carcinoma ( SNEC ) , adenocarcinoma ( ADC ) , and squamous cell carcinoma ( SCC ) . Methods Eligible participants were all patients with histologically confirmed cervical cancer treated at Chiang Mai University Hospital between 1995 and 2011 . We included all patients with SNEC and randomly enrolled patients with ADC and SCC . We used competing-risk regression analysis to examine the risk of cancer-related death by histological type . Results We included 130 ( 6.2 % ) women with SNEC , 346 ( 16.4 % ) with ADC , and 1,632 ( 77.4 % ) with SCC . Age > 60 years ( hazard ratio [ HR ] 4.9 , 95 % confidence interval [ CI ] 2.0–12.0 ) and lymph node involvement ( HR 3.0 , 95 % CI 1.2–7.4 ) were prognostic factors among surgically-treated patients with SNEC . Deeper stromal invasion ( HR 3.6 , 95 % CI 1.6–8.3 ) was a prognostic factor in patients with SCC . In patients with advanced SNEC , age > 60 years had a strong prognostic impact ( HR 2.6 , 95 % CI 1.0–6.5 ) while the International Federation of Gynecology and Obstetrics stages III and IV were prognostic factors for patients with advanced stage ADC ( HR 2.9 , 95 % CI 2.0–4.4 and HR 4.5 , 95 % CI 2.6–7.9 , respectively ) and SCC ( HR 1.7 , 95 % CI 1.4–2.0 and HR 3.7 , 95 % CI 2.8–4.9 , respectively ) compared with the International Federation of Gynecology and Obstetrics stage IIB . Conclusion Clinical and pathological prognostic factors in cervical cancer differed according to histological type . Taking the important prognostic factors for each histological type into consideration may be beneficial for tailored treatment and follow-up planning BACKGROUND Clarifying the prognostic impact of histological type is an essential issue that may influence the treatment and follow-up planning of newly diagnosed cervical cancer cases . 
This study aimed to evaluate the prognostic impact of histological type on survival and mortality in patients with cervical squamous cell carcinoma ( SCC ) , adenocarcinoma ( ADC ) and small cell neuroendocrine carcinoma ( SNEC ) . MATERIAL S AND METHODS All patients with cervical cancer diagnosed and treated at Chiang Mai University Hospital between January 1995 and October 2011 were eligible . We included all patients with SNEC and a random weighted sample of patients with SCC and ADC . We used competing-risks regression analysis to evaluate the association between histological type and cancer-specific survival and mortality . RESULTS Of all 2,108 patients , 1,632 ( 77.4 % ) had SCC , 346 ( 16.4 % ) had ADC and 130 ( 6.2 % ) had SNEC . Overall , five-year cancer-specific survival was 60.0 % , 54.7 % , and 48.4 % in patients with SCC , ADC and SNEC , respectively . After adjusting for other clinical and pathological factors , patients with SNEC and ADC had higher risk of cancer-related death compared with SCC patients ( hazard ratio [ HR ] 2.6 ; 95 % CI , 1.9 - 3.5 and HR 1.3 ; 95 % CI , 1.1 - 1.5 , respectively ) . Patients with SNEC were younger and had higher risk of cancer-related death in both early and advanced stages compared with SCC patients ( HR 4.9 ; 95 % CI , 2.7 - 9.1 and HR 2.5 ; 95 % CI , 1.7 - 3.5 , respectively ) . Those with advanced-stage ADC had a greater risk of cancer-related death ( HR 1.4 ; 95 % CI , 1.2 - 1.7 ) compared with those with advanced-stage SCC , while no significant difference was observed in patients with early stage lesions . CONCLUSION Histological type is an important prognostic factor among patients with cervical cancer in Thailand . Though patients with SNEC were younger and more often had a diagnosis of early stage compared with ADC and SCC , SNEC was associated with poorest survival . ADC was associated with poorer survival compared with SCC in advanced stages , while no difference was observed at early stages . 
Further tailored treatment-strategies and follow-up planning among patients with different histological types should be considered
11,008
27,655,373
Correlates demonstrated expected associations with health outcomes ; the review found low levels of physical activity ( particularly among young people ) , high levels of sedentary behaviour ( particularly among men and young people ) , and expected associations of known correlates ( e.g. gender , age , education , time , self-motivation , social support , and access ) .
Background The dramatic rise in Noncommunicable Diseases ( NCD ) in the oil-producing countries of the Arabian Peninsula is driven in part by insufficient physical activity , one of the five main contributors to health risk in the region . The aim of this paper is to review the available evidence on physical activity and sedentary behaviour for this region . Based on the findings , we prioritize an agenda for research that could inform policy initiatives with regional relevance .
OBJECTIVES This study aimed to investigate the lifestyle habits - physical activity ( PA ) , eating habits ( EH ) , and sleep duration ( SD ) - of Omani adolescents , and to examine gender differences in such variables . METHODS 802 Omani adolescents ( 442 females and 360 males ) , aged 15 - 18 years were randomly recruited . Anthropometric indices , PA level , and EH and SD were evaluated by the Arab Teenage Lifestyle questionnaire . A semi-quantitative food frequency questionnaire for dietary assessment was also administered . RESULTS The results showed that although the study subjects had a sedentary lifestyle ( lack of PA , average of 6.7 hours sleep , and consumption of high calorie foods ) , they maintained a normal body mass ( less than 25 Kg/m(2 ) ) . Males were more than twice as active as females . With respect to EH , there were few gender differences , except in dairy and meat consumption where 62.5 % and 55.5 % of males consumed more than 3 servings , respectively , compared to 18.78 % and 35.2 % of females , respectively . In addition , waist/height ratio , height , reasons for being active , energy drinks , potato consumption , eating sweets , vigorous PA and breakfast EHs were statistically significant independent predictors for BMI , P < 0.05 for both males and females . CONCLUSION This study revealed a high prevalence of sedentary behaviors and a low level of physical activity , especially among females . Unhealthy dietary habits were also widely found among both genders . There is an urgent need for more research as well as a national policy promoting active living and healthy eating and discouraging sedentary behaviour among Omani adolescents The aim of the present study was to determine the prevalence of prehypertension and associated risk factors among young adult females in Dammam , Saudi Arabia . A cross-sectional study was conducted on a sample of about one-third of female students enrolled in 4 colleges of the University of Dammam . 
They were screened for high blood pressure and associated cardiovascular risk factors by an interview questionnaire. Weight, height, waist and hip and blood pressure measurements and random blood glucose testing were done. The results revealed that 13.5% of the 370 students were prehypertensive. The most prevalent risk factor for cardiovascular diseases was physical inactivity (53.2%), followed by overweight/obesity (29.1%); 16.3% of prehypertensive students had 3 or more risk factors. Logistic regression analysis revealed that overweight/obesity was the strongest predictor of prehypertension. Our study indicates a need for routine blood pressure measurements and risk assessment in young adult females in Saudi Arabia. Background A better understanding of the relationships between obesity and lifestyle factors is necessary for effective prevention and management of obesity in youth. Therefore, the objective of this study was to evaluate the associations between obesity measures and several lifestyle factors, including physical activity, sedentary behaviors and dietary habits, among Saudi adolescents aged 14-19 years. Methods This was a school-based cross-sectional study conducted in three cities in Saudi Arabia (Al-Khobar, Jeddah and Riyadh). The participants were 2906 secondary school males (1400) and females (1506) aged 14-19 years, who were randomly selected using a multistage stratified cluster sampling technique. Measurements included weight, height, body mass index (BMI), waist circumference, waist/height ratio (WHtR), screen time (television viewing, video games and computer use), physical activity (determined using a validated questionnaire), and dietary habits (intake frequency per week). Logistic regression was used to examine the associations between obesity and lifestyle factors.
Results Compared with non-obese adolescents, obese males and females were significantly less active, especially in terms of vigorous activity, and had less favorable dietary habits (e.g., lower intake of breakfast, fruits and milk), but had a lower intake of sugar-sweetened drinks and sweets/chocolates. Logistic regression analysis showed that overweight/obesity (based on BMI categories) and abdominal obesity (based on WHtR categories) were significantly and inversely associated with vigorous physical activity levels (aOR for high level = 0.69, 95% CI 0.41-0.92 for BMI and 0.63, 95% CI 0.45-0.89 for WHtR) and significantly associated with lower frequency of breakfast (aOR for <3 days/week = 1.44, 95% CI 1.20-1.71 for BMI and 1.47, 95% CI 1.22-1.76 for WHtR) and vegetable (aOR for <3 days/week = 1.29, 95% CI 1.03-1.59 for WHtR) intakes, and with lower consumption of sugar-sweetened beverages (aOR for <3 days/week = 1.32, 95% CI 1.08-1.62 for BMI and 1.42, 95% CI 1.16-1.75 for WHtR). Conclusions The present study identified several lifestyle factors associated with obesity that may represent valid targets for the prevention and management of obesity among Saudi adolescents. Primary prevention of obesity by promoting active lifestyles and healthy diets should be a national public health priority. OBJECTIVE To investigate the virtual importance of identified barriers to preventive interventions and to explore the association between physicians' characteristics and their attitudes towards prevention. METHODS We conducted a cross-sectional survey of 182 randomly selected family and general physicians (164/182 = 90% response rate) from a total of 385 general physicians in 5 health sectors of Riyadh, Kingdom of Saudi Arabia, in 2005. A pre-tested questionnaire asking physicians to rate the general importance of 8 preventive health strategies was used.
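As a concrete illustration of the anthropometric measures used across these studies, the BMI and waist/height ratio (WHtR) calculations can be sketched in Python. The cut-offs below (BMI ≥ 25 kg/m² for overweight/obesity, WHtR ≥ 0.5 for abdominal obesity) are widely used adult conventions assumed here for illustration; they are not taken from any single study's methods.

```python
# Hypothetical sketch: computing BMI and waist/height ratio (WHtR)
# and applying commonly used adult cut-offs. The thresholds are
# conventional values assumed for illustration, not study-specific.

def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index = weight (kg) / height (m) squared."""
    return weight_kg / height_m ** 2

def whtr(waist_cm: float, height_cm: float) -> float:
    """Waist/height ratio; both measurements in the same units."""
    return waist_cm / height_cm

def classify(weight_kg: float, height_m: float, waist_cm: float) -> dict:
    b = bmi(weight_kg, height_m)
    w = whtr(waist_cm, height_m * 100)
    return {
        "bmi": round(b, 1),
        "overweight_or_obese": b >= 25.0,   # assumed adult BMI cut-off
        "abdominal_obesity": w >= 0.5,      # assumed WHtR cut-off
    }

print(classify(80.0, 1.75, 95.0))
# → {'bmi': 26.1, 'overweight_or_obese': True, 'abdominal_obesity': True}
```

Note that the pediatric studies in this review classify children by CDC BMI-for-age percentiles rather than the fixed adult cut-off shown here.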
RESULTS Ratings of the different preventive interventions varied from 124 (75.6%) for colorectal cancer screening to 155 (94.5%) for blood pressure control. Lack of time was rated an important barrier by 100 (61%) physicians, and lack of patient interest by 125 (76.2%). Four physician characteristics predicted negative attitudes toward prevention: a sedentary lifestyle (odds ratio [OR] = 3.4; 95% confidence interval [CI] 1.1-11.1), lack of awareness of their own blood pressure (OR = 2.0; 95% CI 1.0-3.9), lack of training (OR = 2.2; 95% CI 1.5-2.9), and lack of evidence of benefits (OR = 1.98; 95% CI 1.7-3.9). CONCLUSION The influence of physicians' own health behaviors and the importance of barriers to preventive interventions indicate a need to develop an approach that reduces the dominance of risky behavior. OBJECTIVES To describe the physical activity profile of Saudi adults living in Riyadh, using the International Physical Activity Questionnaire (IPAQ) short-version telephone format. METHODS Physical activity was assessed using the official Arabic short form of the IPAQ, intended for use in telephone interviews. The instrument asks for time spent in walking and in moderate- and vigorous-intensity physical activity of at least 10 min duration. The sample consisted of 1616 Saudis, between 15 and 78 years of age, living in Riyadh. Participants were drawn from a list of names in the telephone book using a simple random method. Telephone interviews were administered during the spring of 2003 by trained male interviewers. RESULTS The final sample size was 1064 Saudi males and females (response rate of 66%), with males comprising about 66% of the respondents. Over 43% of Saudis did not participate in any type of moderate-intensity physical activity lasting at least 10 min.
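The odds ratios and confidence intervals reported throughout these abstracts follow standard 2×2-table arithmetic. A minimal sketch of that calculation is shown below; the counts are invented for illustration and are not data from the physician survey above.

```python
import math

# Hypothetical sketch of how an odds ratio (OR) and its 95% CI are
# derived from a 2x2 table using the log-odds standard error.
# The counts are invented for illustration only:
#                 negative attitude   positive attitude
# sedentary              a = 34             b = 20
# active                 c = 25             d = 50

def odds_ratio_ci(a: int, b: int, c: int, d: int):
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(34, 20, 25, 50)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
# → OR = 3.40 (95% CI 1.64-7.07)
```

A CI whose lower bound exceeds 1.0, as here, corresponds to the "significant" associations the abstracts report; the studies themselves obtain adjusted ORs from logistic regression rather than raw 2×2 tables.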
More than 72% of the sample did not engage in any type of vigorous-intensity physical activity lasting at least 10 min. The proportion of Saudis who walked for 150 min or more per week was 33.3%. Females engaged in more moderate physical activity than males, whereas males participated more in vigorous activity. Activity levels did not show significant relationships with education level or job hours per week. Based on the three activity categories established by the IPAQ, 40.6% of Saudis were inactive, 34.3% were minimally active and 25.1% were physically active. Physical inactivity increased with advancing age. CONCLUSION The data suggest that the prevalence of physical inactivity among Saudi adults is relatively high. Efforts are needed to encourage Saudis to be more physically active, with the goal of increasing the proportion of Saudis engaging in health-enhancing physical activity. OBJECTIVE To assess the patterns and determinants of physical activity among Saudi adult males living in Riyadh. METHODS Self-administered questionnaires were filled out by 1333 randomly selected Saudi males aged 19 years and older during the fall of 1996. RESULTS Over 53% of Saudi males were totally physically inactive, and another 27.5% were irregularly active. Only 19% of the entire sample were active on a regular basis. A curvilinear relationship was found between age and inactivity, with the middle age group the least active. Physical activity was lower among those who were married, worked in the private sector, worked 2 shifts, were less educated, or had only one day off during the week. Time constraints seem to be the major contributing factor to inactivity, while maintaining health and losing weight were the most important reasons for being physically active among Saudi males. CONCLUSION The proportion of Saudi males who are at risk from inactivity is very high.
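The three IPAQ categories referred to above (inactive / minimally active / physically active) are derived from weekly MET-minutes. The simplified sketch below uses the MET weights (walking 3.3, moderate 4.0, vigorous 8.0) and the 600 and 3000 MET-min/week thresholds from the published IPAQ scoring guidance; note that the full protocol also applies days-per-week criteria, which are omitted here.

```python
# Simplified sketch of IPAQ short-form scoring. MET weights and the
# 600 / 3000 MET-min/week thresholds follow the published IPAQ scoring
# guidance; the full protocol's days-per-week criteria are omitted.

MET = {"walking": 3.3, "moderate": 4.0, "vigorous": 8.0}

def weekly_met_minutes(days_minutes: dict) -> float:
    """days_minutes maps activity type -> (days/week, minutes/day)."""
    return sum(MET[k] * d * m for k, (d, m) in days_minutes.items())

def ipaq_category(total_met_min: float) -> str:
    if total_met_min >= 3000:
        return "physically active"   # "high" in IPAQ terminology
    if total_met_min >= 600:
        return "minimally active"    # "moderate"
    return "inactive"                # "low"

# Example week: 5 x 30 min walking, 2 x 20 min moderate activity
week = {"walking": (5, 30), "moderate": (2, 20), "vigorous": (0, 0)}
total = weekly_met_minutes(week)     # about 655 MET-min/week
print(f"{total:.0f} MET-min/week -> {ipaq_category(total)}")
```

This makes the survey's finding concrete: a respondent can walk most days and still fall short of the "physically active" band, which requires roughly five times that weekly volume.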
Indeed, it is considerably higher than the proportion at risk from hypertension, hypercholesterolemia, obesity, or cigarette smoking. Public policies are needed to encourage active living and discourage sedentary habits. Health care providers have an important role in promoting physical activity among the population. Background: The high prevalence of physical inactivity in Saudi Arabia is a growing challenge to public health. This study aimed to examine the prevalence of physical activity (PA) and associated factors among female university students. Methods: This cross-sectional study involved 663 randomly selected female university students who completed the Arab Teens Lifestyle questionnaire. Data included measurements of anthropometric, socioeconomic and environmental factors, as well as self-reported PA. Ordinal regression was used to identify factors associated with low, moderate and high PA levels. Results: The mean age of participants was 20.4 years (SD 1.5). The mean BMI of the students in relation to PA was 23.0, 22.9 and 22.1 for high, moderate and low levels of activity, respectively. The analysis revealed significantly higher PA levels among married students, those with highly educated mothers, and those who lived far from parks, and lower activity levels among underweight students. Conclusions: This study raises four important determinants of female university students' PA levels. These factors could be of great importance in the endeavor to prevent the health-threatening increase in physical inactivity patterns, and thus non-communicable diseases and obesity, where the focus should be on the specific situation and needs of women in Saudi Arabia. Flaws in the design, conduct, analysis, and reporting of randomised trials can cause the effect of an intervention to be underestimated or overestimated.
The Cochrane Collaboration's tool for assessing risk of bias aims to make the process clearer and more accurate. This study investigated the cross-cultural differences and similarities in health behaviors between Saudi and British adolescents. A school-based cross-sectional study was conducted in four cities in Saudi Arabia (Riyadh and Al-Khobar; N = 1,648) and Britain (Birmingham and Coventry; N = 1,158). The participants (14-18 year-olds) were randomly selected using a multistage stratified cluster sampling technique. Measurements included anthropometrics, screen time, a validated physical activity (PA) questionnaire, and dietary habits. The overweight/obesity prevalence among Saudi adolescents (38.3%) was significantly (p < 0.001) higher than that among British adolescents (24.1%). The British adolescents demonstrated higher total PA energy expenditure than Saudi adolescents (means ± SE = 3,804.8 ± 81.5 vs. 2,219.9 ± 65.5 MET-min/week). Inactivity prevalence was significantly (p < 0.001) higher among Saudi adolescents (64%) than among British adolescents (25.5%). The proportions of adolescents exceeding 2 h of daily screen time were high (88.0% and 90.8% among Saudis and British, respectively). The majority of Saudi and British adolescents did not have daily intakes of breakfast, fruit, vegetables and milk. MANCOVA showed significant (p < 0.05) gender-by-country interactions in several lifestyle factors. There were significant (p < 0.001) gender differences in the ratio of physical activity to sedentary behaviors. In conclusion, Saudi and British adolescents demonstrated some similarities and differences in their PA levels, sedentary behaviors and dietary habits.
Unhealthy lifestyle behaviors among adolescents appear to be a cross-cultural phenomenon. CONTEXT There is limited information on the effects of mechanical loading caused by physical activity (PA) on sclerostin, IGF-I, and bone turnover markers (BTM). OBJECTIVE The objective of the investigation was to study the relationships between serum sclerostin, serum IGF-I (s-IGF-I), BTM, and the PA level in premenopausal women and to discern how 8 wk of PA training (PAT) affects the serum levels of sclerostin, IGF-I, and BTM. DESIGN This was a cross-sectional study with a subgroup followed up longitudinally. SETTINGS AND SUBJECTS A total of 1235 randomly selected premenopausal women were studied cross-sectionally. We also followed 58 of these women longitudinally during an 8-wk course of PAT (4 d/wk) and compared them with 62 controls. All women were medically examined, and bone mineral density (BMD) and serum levels of sclerostin, s-IGF-I, and BTM were determined. RESULTS Women with PA of greater than 120 min/wk showed significantly lower serum sclerostin (by 36.8%) but higher s-IGF-I (by 107%) levels than sedentary controls. Bone formation markers were also higher in the PA greater than 120 min/wk group compared with the sedentary controls. In the longitudinal study, the 8-wk PAT program led to a decrease in serum sclerostin (by 33.9%, P < 0.0001) but increases in the serum levels of the bone formation markers and IGF-I (s-IGF-I by 74.2%, P < 0.0001). CONCLUSIONS This study demonstrates that even minor changes in PA are associated with effects on serum levels of sclerostin, IGF-I, and BTM, and suggests that sclerostin could be a link between mechanical loading and disuse osteoporosis in humans. Even when adults meet physical activity guidelines, sitting for prolonged periods can compromise metabolic health.
Television (TV) time and objective measurement studies show deleterious associations, and breaking up sedentary time is beneficial. Sitting time, TV time, and time sitting in automobiles increase premature mortality risk. Further evidence from prospective studies, intervention trials, and population-based behavioral studies is required. Obesity among pregnant women is becoming one of the most important women's health issues. Obesity is associated with an increased risk of almost all pregnancy complications: gestational hypertension, preeclampsia, gestational diabetes mellitus, delivery of large-for-gestational-age infants, and a higher incidence of congenital defects all occur more frequently than in women with a normal BMI. Evidence shows that the child of an obese mother may suffer from exposure to a suboptimal in utero environment and that early life adversities may extend into adulthood. In September 2009, ILSI Europe convened a workshop with multidisciplinary expertise to review the practices and science base of health and nutrition of obese pregnant women, with a focus on the long-term health of the child. The consensus viewpoint of the workshop identified gaps and gave recommendations for future research on gestational weight gain, gestational diabetes, and research methodologies. The evidence available on the short- and long-term health impact for mother and child currently favors actions directed at controlling prepregnancy weight and preventing obesity in women of reproductive age. More randomized controlled trials are needed to evaluate the effects of nutritional and behavioral interventions on pregnancy outcomes. Moreover, suggestions that maternal obesity may transfer obesity risk to the child through non-Mendelian (e.g. epigenetic) mechanisms require more long-term investigation. OBJECTIVE To assess the nutrition and health status, nutrient intake, and physical activity among Saudi medical students.
METHODS A cross-sectional survey using a questionnaire, anthropometric measurements, and laboratory assessments was conducted from January to May 2011 on 194 randomly selected Saudi medical students at Taibah University, Madinah, Kingdom of Saudi Arabia. The adequacy of nutrient intake was compared with the recommended daily intake (RDI) per the National Research Council. RESULTS Caloric intake was derived from carbohydrates (72.1%), fats (19.4%) and proteins (8.4%). Proteins and fats were obtained from animal sources more than from plant sources (5.3% versus 3.2% for proteins and 11.6% versus 7.8% for fats). There were low percentages of the RDI for fiber (8.5%), most vitamins, especially vitamin D (14.2%), and minerals (potassium 31.3%, zinc 40.7%, magnesium 24.5%, and calcium 47%). Overall, 34.5% of the students were overweight, and 10.3% were obese. Dyslipidemia was diagnosed in 24.7%, and 56.2% had high high-sensitivity C-reactive protein (hs-CRP). There was a positive correlation between median caloric intake and both BMI (r = 0.42, p = 0.00) and hs-CRP (r = 0.3, p = 0.001). Inactivity was prevalent among the students (64.4%). CONCLUSION This study showed deficiencies in several essential nutrients among medical students, and the prevalence of overweight status, obesity, and inactivity was relatively high. These results indicate the need to improve nutrition and promote healthy lifestyles among medical students. The aim of this study was to investigate whether body mass index (BMI), eating habits and sedentary behaviours were associated with sleep duration among Kuwaiti adolescents. The study is part of the Arab Teens Lifestyle Study (ATLS), a school-based cross-sectional multi-center collaborative study.
A sample of 906 adolescents (boys and girls) aged 14-19 years was randomly selected from 6 Kuwaiti governorates using a multistage stratified cluster sampling technique. The findings revealed that the prevalence of overweight and obesity was 50.5% in boys and 46.5% in girls. The majority of boys (76%) and girls (74%) fell into the short sleep duration category (6 hours/day or less). Sleep duration was found to be negatively associated with BMI (girls only). Watching television (boys and girls) and working on computers (boys only) were also negatively associated with sleep duration, while the consumption of breakfast (both genders) and milk (boys only) was positively associated with sleep duration (p < 0.05). In contrast, the consumption of fast foods (both genders), sugar-sweetened drinks and sweets (boys only) and potatoes (girls only) was negatively associated with sleep duration (p < 0.05). It can be concluded that the majority of Kuwaiti adolescents exhibit insufficient sleep duration, which was associated with obesity measures and a combination of poor eating habits and more sedentary behaviors. The findings also suggest gender differences in these associations. Therefore, adequate sleep is an important modifiable factor for preventing obesity and the unhealthy lifestyle habits associated with short sleep. Background Few lifestyle factors have been simultaneously studied and reported for Saudi adolescents. Therefore, the purpose of the present study was to report on the prevalence of physical activity, sedentary behaviors and dietary habits among Saudi adolescents, and to examine the interrelationships among these factors, using representative samples drawn from three major cities in Saudi Arabia. Methods This school-based cross-sectional study was conducted during the years 2009-2010 in three cities: Al-Khobar, Jeddah and Riyadh.
The participants were 2908 secondary-school males (1401) and females (1507) aged 14-19 years, randomly selected using a multistage stratified sampling technique. Measurements included weight, height, sedentary behaviors (TV viewing, playing video games and computer use), physical activity (using a validated questionnaire) and dietary habits. Results A very high proportion of Saudi adolescents (84% of males and 91.2% of females) spent more than 2 hours on screen time daily, and almost half of the males and three-quarters of the females did not meet daily physical activity guidelines. The majority of adolescents did not have a daily intake of breakfast, fruit, vegetables or milk. Females were significantly (p < 0.05) more sedentary and much less physically active, especially in vigorous physical activity, and consumed breakfast, fruit, milk and dairy products, sugar-sweetened drinks, fast foods and energy drinks on fewer days per week than males. However, the females' intake of French fries and potato chips, cakes and doughnuts, and candy and chocolate was significantly (p < 0.05) higher than the males'. Screen time was significantly (p < 0.05) inversely correlated with the intake of breakfast, vegetables and fruit. Physical activity had a significant (p < 0.05) positive relationship with fruit and vegetable intake but not with sedentary behaviors. Conclusions The high prevalence of sedentary behaviors, physical inactivity and unhealthy dietary habits among Saudi adolescents is a major public health concern. There is an urgent need for a national policy promoting active living and healthy eating and reducing sedentary behaviors among children and adolescents in Saudi Arabia. Background: To formulate intervention strategies for hypertension in the community, it is essential to quantify the magnitude of the disease and its risk factors.
The patterns of physical activity have not been studied as a risk factor or predictor of hypertension in Saudi Arabia. Materials and Methods: This was a community-based cross-sectional study of adults using the STEP-wise approach and a multistage, stratified, cluster random sample. Data were collected using a questionnaire that included sociodemographics, blood pressure, and the patterns, levels and duration of physical activity. Results: Of a total of 4758 participants, 1213 (25.5%) were hypertensive. Hypertension was significantly negatively associated with the total levels and duration of physical activity in leisure, transport, and work. Significant predictors of hypertension included lower levels of work involving moderate physical activity for 10 min, walking/cycling for 10 min continuously, and vigorous activity during leisure time. Conclusions: Hypertension is prevalent among adults; physical inactivity is a significant risk factor and predictor. Targeting this modifiable risk factor can help in prevention, early diagnosis, and control. OBJECTIVE Dependence on motorized forms of transportation may contribute to the worldwide obesity epidemic. Shifts in transportation patterns occurring in China provide an ideal opportunity to study the association between vehicle ownership and obesity. Our objective was to determine whether motorized forms of transportation promote obesity. RESEARCH METHODS AND PROCEDURES A multistage random-cluster sampling process was used to select households from eight provinces in China. Data were included on household vehicle ownership and individual anthropometric and sociodemographic status. Cross-sectional data (1997) from 4741 Chinese adults aged 20 to 55 years were used to explore the association between vehicle ownership and obesity.
Cohort data (1989 to 1997) from 2485 adults aged 20 to 45 years in 1989 (59% follow-up) were used to measure the impact of vehicle acquisition on the odds of becoming obese. RESULTS Our main outcome measures were current obesity status and the odds of becoming obese over an 8-year period. In 1997, 84% of adults did not own motorized transportation. However, the odds of being obese were 80% higher (p < 0.05) for men and women in households that owned a motorized vehicle compared with those that did not. Fourteen percent of households acquired a motorized vehicle between 1989 and 1997. Compared with those whose vehicle ownership did not change, men who acquired a vehicle experienced a 1.8-kg greater weight gain (p < 0.05) and had 2-to-1 odds of becoming obese. DISCUSSION Encouraging active forms of transportation may be one way to protect against obesity. PURPOSE AND METHODS Although evidence is accumulating that sedentary behavior (SB), independent of moderate-to-vigorous intensity physical activity (MVPA), is associated with cardiometabolic and aging outcomes in adults, several gaps present opportunities for future research. This article reports on the "Research Evidence on Sedentary Behavior" session of the Sedentary Behavior: Identifying Research Priorities workshop, sponsored by the National Heart, Lung, and Blood Institute and the National Institute on Aging, which aimed to identify priorities in SB research. RESULTS AND CONCLUSIONS A consensus definition of SB has not yet been established, although agreement exists that SB is not simply all behavior other than MVPA. The two most common definitions are as follows: one based solely on intensity (<1.5 metabolic equivalents [METs]) and another that combines low intensity (≤1.5 METs) with a seated or reclining posture. Thus, for the definition of SB, evaluation of whether or not to include a postural component is a research priority.
SB assessment methodologies include self-report and objective measurement, each offering distinct information. Therefore, evaluation, standardization, and comparison across self-report and objective assessment methods are needed. Specific priorities include the development and validation of novel devices capable of assessing posture, and the standardization of research practices for SB assessment by accelerometry. The prospective evidence that SB relates to health outcomes is limited in that SB is almost exclusively measured by self-report. The lack of longitudinal studies with objectively measured SB was recognized as a major research gap, making examination of the association between objectively measured SB and adverse health outcomes in longitudinal studies a research priority. Specifically, studies with repeated measures of SB, evaluating dose-response relationships, and including more diverse populations are needed. Background: Determining the stages of change in physical activity (PA) helps in designing effective PA promotion interventions. The aim of the study was to assess the readiness of students of King Saud University (KSU) to become more physically active, and to relate this to their self-efficacy, perceived benefits and perceived barriers to PA. Method: A cross-sectional descriptive study was conducted at KSU, Riyadh, between March and May 2007, using a self-administered questionnaire. The total sample size was 302 randomly chosen male and female students. Results: More than half of the students (55.3%) reported that they participated in PA, in the action (for <6 months) and maintenance (for ≥6 months) stages. The remaining students did not engage in PA, being in the precontemplation, contemplation, preparation or relapse stages. More males were found in the maintenance stage, but more females were in the inactive stages (precontemplation, contemplation and preparation).
Only 24.4% of the students engaged in PA three times or more per week, and 9.9% engaged in PA regularly throughout the year. About 39%, significantly more of them males, did vigorous PA for 20 minutes or more. However, 4.6% described themselves as hyperactive. More females used the stairs, did housework and considered themselves moderately active. Nearly 50% had a low total score for perceived barriers, while 85% had high perceived benefits and 63.6% had moderate self-efficacy for PA. The main perceived barriers were time and resources. The barriers score decreased significantly across the stages of change, but the pattern was reversed for perceived benefits and self-efficacy (p < 0.05). Conclusion: Physical inactivity is common among KSU students. A considerable proportion of them were not ready to become more physically active. The study highlights the need to adapt PA promotion programs to states of readiness for PA. University and public policies, as well as environmental changes, are necessary to encourage active living within the context of Islamic rules and Saudi culture. This study was designed to identify independent predictors of all osteoporosis-related fractures (ORFs) among healthy Saudi postmenopausal women. We prospectively followed a cohort of 707 healthy postmenopausal women (mean age 61.3 ± 7.2 years) for 5.2 ± 1.3 years. Data were collected on demographic characteristics, medical history, personal and family history of fractures, lifestyle factors, daily calcium intake, vitamin D supplementation, and physical activity score. Anthropometric parameters, total fractures (30.01 per 1000 women/year), special physical performance tests, bone turnover markers, hormone levels, and bone mineral density (BMD) measurements were recorded.
The final model consisted of seven independent predictors of ORFs [each lowest quartile (Q1) vs highest quartile (Q4)]: physical activity score (Q1 vs Q4: ≤12.61 vs ≥15.38; relative risk estimate [RR] = 2.87; 95% confidence interval [CI] 1.88-4.38); age ≥60 years vs age <60 years (RR = 2.43; 95% CI 1.49-3.95); handgrip strength (Q1 vs Q4: ≤13.88 vs ≥17.28 kg; RR = 1.88; 95% CI 1.15-3.05); total hip BMD (Q1 vs Q4: ≤0.784 vs ≥0.973 g/cm²; RR = 1.86; 95% CI 1.26-2.75); dietary calcium intake (Q1 vs Q4: ≤391 vs ≥648 mg/day; RR = 1.66; 95% CI 1.08-2.53); serum 25(OH)D (Q1 vs Q4: ≤17.9 vs ≥45.1 nmol/L; RR = 1.63; 95% CI 1.06-2.51); and a past-year history of falls (RR = 1.61; 95% CI 1.06-2.48). Compared with having none (41.9% of women), having three or more clinical risk factors (4.8% of women) increased fracture risk by more than 4-fold, independent of BMD. Having three or more risk factors and being in the lowest tertile of the T-score of the total hip/lumbar spine (L1-L4) was associated with a 14.2-fold greater risk than having no risk factors and being in the highest T-score tertile. Several clinical risk factors were independently associated with all ORFs in healthy Saudi postmenopausal women. The combination of multiple clinical risk factors and low BMD is a very powerful indicator of fracture risk. AIMS Insofar as hypertension is a risk factor for cardiovascular morbidity and mortality in Type 2 diabetes mellitus (T2DM), this study investigated the incidence of hypertension and associated risk factors in Saudi T2DM patients. METHODS A hospital-based, 11-year (1993-2004) prospective study of 916 adult, originally normotensive, T2DM Saudi patients (488 male and 428 female).
T2DM was diagnosed as per World Health Organization (WHO) criteria, while hypertension was assessed according to the Seventh Joint National Committee criteria for hypertension classification (JNC VII). Risk factors were analyzed in those who developed hypertension. RESULTS The hypertension incidence was 17.2/100 person-years, based on 2833.63 person-years of follow-up of the cohort. Age-adjusted Cox regression coefficients showed that the significant risk factors for developing hypertension were older age, higher HbA1c, higher BMI, elevated triglycerides (>1.8 mmol/l) and creatinine (>115 mmol/l), smoking, proteinuria, microalbuminuria, lack of physical exercise, and retinopathy, while anti-platelet and lipid-lowering drugs had lower hypertension hazard ratios. Cox proportional hazards modelling showed that older age, male gender, higher BMI, diabetes duration (<5 years), and retinopathy were independent predictors of hypertension, while exercise, lipid-lowering, and anti-platelet medications were associated with a reduced hypertension incidence rate. CONCLUSIONS The incidence of hypertension in Saudi T2DM patients is comparable to that in other communities, with older age, male gender, higher BMI, diabetes duration of <5 years and retinopathy being strong predictors of hypertension development. Objectives: To estimate the prevalence and determinants of obesity in childhood and adolescence and their association with blood pressure (BP) in Abu Dhabi, United Arab Emirates (UAE). Design: A cross-sectional population-representative study. Subjects: A total of 1541 students (grades 1-12; aged 6-19 years) were randomly selected from 246 schools (50% male). Anthropometric and demographic variables were measured, and CDC criteria were used to classify children's weights. Results: A total of 1440 (93%) students provided complete results. Crude prevalences were: 7.6% underweight, 14.7% overweight and 18.9% obesity.
Further analyses were restricted to UAE nationals (n=1035), among whom these figures were: 8.3% underweight (females 6.5%, males 10.1%; P=0.06), 14.2% overweight (females 16.7%, males 11.6%; P<0.01), and 19.8% obesity (females 18.1%, males 21.4%; P=0.09). Obesity significantly (P<0.001) increased with age. The majority (61.3%) of students had body mass index (BMI) percentiles above the 50th CDC percentile. Stepwise linear regression of BMI percentile on age, sex, dairy consumption, exercise and family income showed a significant (P<0.01) positive association with age and lack of dairy consumption, but not with exercise and income. BP significantly (P<0.01) increased with BMI percentile. Conclusions: The prevalence of childhood obesity is high across the age spectrum in the UAE. Older age, male sex, lack of dairy intake and higher parental BMI are independent determinants of childhood obesity in this population. A higher BMI percentile is associated with higher BP. Prevention strategies should focus on younger children, particularly children of obese parents. Longitudinal studies are needed to investigate trends and the impact of childhood obesity on the risk of non-communicable diseases. BACKGROUND Understanding the inter-relationships between lifestyle factors in youth is important with respect to the development of effective promotional programmes for healthy eating and active living. The present study aimed to explore the associations of dietary habits (DH) with physical activity (PA) and screen time (ST) among Saudi adolescents aged between 15 and 19 years relative to gender. METHODS Data were obtained from the Arab Teens Lifestyle Study, a school-based multicentre lifestyle study conducted in 2009/2010 in three major cities in Saudi Arabia. A multistage stratified cluster random sampling technique was used.
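The CDC weight classification used in the Abu Dhabi study above maps a child's BMI-for-age percentile onto categories. A minimal sketch of that mapping, using the standard CDC cut-offs (the thresholds are general CDC convention, not values quoted in the abstract):

```python
def cdc_weight_status(bmi_percentile: float) -> str:
    """Classify a child's BMI-for-age percentile using standard CDC cut-offs."""
    if bmi_percentile < 5:
        return "underweight"
    if bmi_percentile < 85:
        return "healthy weight"
    if bmi_percentile < 95:
        return "overweight"
    return "obese"

print(cdc_weight_status(90))  # overweight
```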
The number of participants with complete data for DH and PA was 2886, and the respective number for DH and ST was 2822. Assessment included weight, height, body mass index, total daily ST (television viewing, video/computer games and Internet use), PA and DH using self-reported questionnaires. RESULTS Females were significantly more sedentary and less active than males (P<0.001). Two-way analysis of covariance, controlling for age, showed significant (P<0.05) gender-by-PA and gender-by-ST interactions for several DH. Logistic regression analyses revealed significant associations of higher PA with a higher consumption of fruit, vegetables, milk, French fries/potato chips and energy drinks, whereas higher ST was significantly associated with a higher consumption of sugar-sweetened drinks, fast foods, cake/doughnuts and energy drinks. CONCLUSIONS Healthful dietary habits were associated mostly with PA, whereas sedentary behaviours, independent of PA, negatively impacted upon eating behaviours. The low PA levels and high sedentary levels of Saudi females represent a great concern. The results reported in the present study have important implications for both youth public health policies and intervention programmes. OBJECTIVE To assess physical activity levels among Saudi adults, and to examine the relationships of physical activity with body mass index (BMI), waist circumference (WC) and obesity prevalence. METHODS Data were taken from the Coronary Artery Disease in Saudis Study, a national epidemiological health survey carried out between 1995 and 2000. Participants included 17395 Saudi males and females aged 30-70 years, selected randomly using a multistage stratified cluster sampling technique. Leisure-type and sport-related physical activities, including walking, were assessed using an interviewer-administered questionnaire.
The activities were classified into five intensity categories and assigned metabolic equivalents (MET) according to the compendium of physical activities. Based on the intensity, duration and frequency of physical activity, subjects were classified into an active or inactive category. RESULTS Inactivity prevalence (96.1%) was very high. There were significantly (p<0.001) more inactive females (98.1%) than males (93.9%). Inactivity prevalence increased with increasing age category, especially in males, and decreased with increasing education level. Inactivity was highest in the central region (97.3%; 95% CI = 96.8-97.8%) and lowest in the southern region of Saudi Arabia (94.0%; 95% CI = 93.2-94.8%). Further, active individuals exhibited lower values of BMI and WC. CONCLUSION These findings reveal the sedentary nature of the Saudi population. The overwhelming majority of men and women did not reach the recommended physical activity levels necessary for promoting health and preventing disease. The high prevalence of inactivity among Saudis represents a major public health concern. OBJECTIVE To evaluate the lifestyle and dietary habits of school students and the prevalence of some nutritional problems. METHODS We conducted this study in Abha city during the scholastic year 2000. A two-stage random sample was used to select the students. The sample consisted of 767 male and female students in different grades of education. A designed questionnaire was used to collect data regarding lifestyle practices and dietary habits. Weight, height, and body mass index were obtained. RESULTS Diets were rich in carbohydrates and deficient in fiber. Breakfast was a regular meal for 72% of primary school students compared to 49% of secondary school students. Milk was consumed daily by 51.5% of the sample; fast food consumption was low (2.0 ± 1.7 times/month).
Physical exercise was practiced significantly longer by males than by females; 8.6% and 5.8% of males in intermediate and secondary grades were smokers. Sleeping hours during school days were adequate (7.4 ± 1.7 hours/day), but relatively higher (9.5 ± 2.3) during vacation. Underweight (18.9%), obesity (15.9%), and overweight (11%) were prevalent. Overweight and obesity were significantly more prevalent among females of primary and secondary grades. CONCLUSION Health education and physical education programs in the schools are recommended to promote healthy lifestyles and dietary habits. School feeding programs may be required to achieve some of these goals. BACKGROUND Lifestyle factors are important determinants of adequate sleep among adolescents. However, findings on sleep duration relative to lifestyle factors are conflicting. Therefore, this study examined the association of self-reported sleep duration with physical activity, sedentary behaviours and dietary habits among Saudi adolescents. METHODS A multicentre school-based cross-sectional study was conducted in three major cities in Saudi Arabia. The sample included 2868 secondary-school students (51.9% girls) aged 15-19 years, randomly selected using a multistage stratified cluster sampling technique. In addition to anthropometric measurements, sleep duration, physical activity, sedentary behaviours and dietary habits were assessed using a self-reported questionnaire. RESULTS Several lifestyle factors were associated with sleep duration in adolescents.
While controlling for some potential confounders, the findings showed that high screen time (>5 h/day; adjusted odds ratio (aOR) = 1.505, 95% confidence interval (CI) = 1.180-1.920, P = 0.001) and low (aOR = 1.290, 95% CI = 1.064-1.566, P = 0.010) to medium (aOR = 1.316, 95% CI = 1.075-1.611, P = 0.008) physical activity levels were significantly related to daily sleep of 8 h or longer. Furthermore, a low intake of breakfast (<3 days/week compared with 5 days or more per week) decreased the odds of having adequate sleep duration by a factor of 0.795 (95% CI = 0.667-0.947, P < 0.010). CONCLUSIONS Short sleep duration (<8 h/day) among Saudi adolescents aged 15-19 years was significantly associated with several lifestyle factors. Intervention programs aiming to improve sleeping habits among adolescents need to consider such potential associations of lifestyle variables with sleep duration. Background The increasing rate of obesity among Kuwaiti adolescents is associated with immediate and long-term risks to their health and well-being. Objective To update data on the prevalence of overweight and obesity among Kuwaiti adolescents and to examine the relative contribution of selected lifestyle factors to overweight and obesity in this population. Methods The present study is part of the Arab Teens Lifestyle Study (ATLS). A total of 906 adolescents (463 boys and 443 girls) aged between 14 and 19 years were selected from Kuwaiti schools by a multistage stratified randomization process. A validated questionnaire was used to collect data on physical activity, sedentary lifestyle, and eating habits. The International Obesity Task Force (IOTF) cutoff values for adolescents under 18 years of age were used to define overweight and obesity. Total energy expenditure was calculated using metabolic equivalent-minutes per week.
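The metabolic equivalent-minutes-per-week measure mentioned above is conventionally computed as the MET value of each activity multiplied by its session duration and weekly frequency, summed across activities. A minimal sketch of that calculation; the MET values and session figures are illustrative assumptions, not data from the study.

```python
# MET-minutes/week = MET value * minutes per session * sessions per week,
# summed over all reported activities.
def met_minutes_per_week(activities):
    return sum(met * minutes * sessions for met, minutes, sessions in activities)

# e.g. brisk walking (~3.5 MET, 30 min, 5x/wk) + jogging (~7 MET, 20 min, 2x/wk)
total = met_minutes_per_week([(3.5, 30, 5), (7.0, 20, 2)])
print(total)  # 805.0
```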
A general linear model was used to establish the proportion of the variance (expressed in partial eta squared) in excess weight attributable to differences in eating habits and physical activity. Results The prevalence of overweight and obesity was 50.5% in boys and 46.5% in girls. Among boys, moderate and vigorous activities were significantly negatively associated with overweight and obesity (p < .05), whereas in girls, only activities of at least moderate intensity were negatively associated with overweight and obesity (p < .05). Sedentary behaviors, time spent watching television, and time spent working on the computer were not significantly associated with obesity in either sex. Consumption of breakfast, vegetables, and fast foods (boys and girls) and potatoes, cakes and doughnuts, and sweets (girls only) was significantly associated with overweight and obesity (p < .05). In general, the partial eta squared explained by physical activity was less than 3.6% in boys compared with less than 1.0% in girls, and eating habits explained less than 1.8% in boys compared with 2.5% in girls. Conclusions Physical activity explains a greater proportion of variation in body mass index than do eating habits, particularly in boys. Eating habits explain a greater proportion of variation in body mass index than does physical activity in girls. Prospective studies are needed to clarify the relative effects of sedentary behaviors on overweight in adolescents. OBJECTIVES The aim of the study was to examine the gender differences and predictors of leisure versus non-leisure time physical activities among Saudi adolescents aged 14-19 years. MATERIALS AND METHODS A multistage stratified cluster random sampling technique was used. A sample of 1,388 males and 1,500 females enrolled in secondary schools in three major cities in Saudi Arabia was included.
Anthropometric measurements were performed and body mass index was calculated. Physical activity, sedentary behaviours and dietary habits were measured using a self-reported validated questionnaire. RESULTS The total time spent in leisure and non-leisure physical activity per week was 90 and 77 minutes, respectively. Males spent more time per week in leisure-time physical activities than females. Females in private schools spent more time during the week in leisure-time physical activities compared to females in state schools. There was a significant gender-by-obesity-status interaction in leisure-time physical activity. Gender, among other factors, predicted the total duration spent in leisure-time and non-leisure-time physical activity. CONCLUSIONS The study showed that female adolescents are much less active than males, especially in leisure-time physical activities. Programmes to promote physical activity among adolescents are urgently needed, with consideration of gender differences. Type 2 diabetes mellitus (T2DM) is a metabolic disease affecting the metabolism of carbohydrates, proteins and lipids, due to initial insulin resistance at the cellular level followed by progressive insulin deficiency. The consequence is a gradual increment of serum blood glucose, which binds, independent of enzymes, to proteins in the blood as well as to tissue proteins; one measurable such protein is red blood cell hemoglobin, whose percentage of glycosylation enables the clinician to assess diabetes control every 3 months and correlates well with other tissue involvement and overall disease control.
Diabetes-related complications are related to micro- and macrovascular derangement due to glycosylated proteins in vessel tissues, leading to nephropathy, retinopathy, neuropathy, myocardial infarction, stroke, and death, found to be more frequent and seen at an earlier age.1 Estimates of global T2DM indicate that diabetes now affects 246 million people worldwide and is expected to affect approximately 380 million by 2025, representing 7.1% of the world adult population.2 The Kingdom of Saudi Arabia (KSA) has one of the highest prevalences of T2DM worldwide, since the percentage of diabetes mellitus is estimated to be 23.7% of the overall population.3 Structured exercise interventions have been shown to be at least as efficacious as pharmaceutical agents for improving glycemic control.4 Most people with T2DM do not do enough physical activity to benefit their health, and in comparison with the general population, people with diabetes experience a higher frequency of relapse to sedentary behavior.5 Our primary objective was to investigate the effects of supervised aerobic exercise on glycemic control, the glycosylated hemoglobin (HbA1c) level, as well as the effect on medication dose adjustments, before and after the study, in T2DM Saudi male patients. Our secondary objective was to assess its effect on their lipid profile. This study was performed as a collaboration between the Department of Physical Education at the King Fahd University of Petroleum and Minerals (KFUPM) and King Fahd Hospital of the University (KFHU), University of Dammam, from July to October 2012. Fifty male patients registered in the diabetes clinic records at KFHU, Al-Khobar, KSA were randomly selected using stratified random sampling and pooled in the study for inclusion and exclusion criteria filtering. Twenty-four patients diagnosed with T2DM, age range 25-55 years (43.86±10.3 years), were selected to be enrolled in the study.
Their medication profiles were documented at the beginning and at the end of the program from their medical files. Diet and lifestyle of patients were not controlled by the study. We excluded patients known to have cardiovascular disease, hypertension, retinopathy, neuropathy, bronchial asthma, or arthritis, and the morbidly obese. We also excluded those who were involved in any physical exercise program or subjects living very far from KFUPM, the site of the physical exercise. After initial screening and baseline assessments, all 24 subjects were randomly allocated to 2 groups: an intervention group (n=12) and a control group (n=12). The exercise group subjects signed a written informed consent after being informed of all possible risks associated with the exercise training programme. A comprehensive clinical examination, as well as laboratory testing and fitness assessment, was performed for each subject before participation in the exercise training programme, as well as at the end of the programme. All subjects went through one week of familiarization before the start of the programme. After a few weeks, 2 subjects withdrew from the intervention group due to personal reasons. Only 10 subjects completed the 12 weeks of the exercise training programme. The exercise training programme was given at the fitness centre of KFUPM. The entire duration of the 12-week exercise training programme was closely supervised by 2 qualified personnel. Attendance was verified every day through direct observation and documented records. Exercise data were recorded every day on a chart. The exercise session included running or walking on a treadmill, and cycling on a bicycle ergometer. We provided 10 minutes of warming up and 10 minutes of cooling down in each exercise session. Each participant had to check his blood glucose levels before and after the exercise.
Subjects performed exercise 3 times per week on alternate days, with no more than a 2-day gap. The training was designed to provide an alternate day as a replacement for missed training sessions during the week. Subjects started with a training intensity of 40-50% of maximum heart rate. OBJECTIVE To assess the levels of adiposity and physical activity among Saudi preschool children from Jeddah. METHODS Participants included 224 Saudi preschool children, randomly selected from public and private preschools in Jeddah during April and May of 2006, using a multistage stratified sampling technique. Measurements included weight, height, body mass index, triceps and subscapular skinfolds, fat percentage, fat mass (FM), fat-free mass (FFM), FM index (FMI), FFM index (FFMI), time spent watching television, and physical activity levels using an electronic pedometer for 3 continuous weekdays. RESULTS The fat content averaged 20.6% of body weight, while the prevalence of obesity was 10.8%. There were significant gender differences in fat percentage, FM, FFM, FMI, and FFMI. The mean pedometer-determined step count for the preschool children was 6773.2 steps per day. Boys were significantly more active than girls. Only 22.4% of the preschool children took 10,000 steps or more per day. There were no significant age differences in skinfold measurements, fat percentage, FMI, FFMI, central obesity or daily step counts. Television viewing time increased by 22.5% from age 4 to age 6. Compared to the non-obese, obese preschool children were significantly heavier and taller, and had higher values for all adiposity indices and television viewing time. CONCLUSION A considerable proportion of Saudi preschool children are obese and an even greater proportion are physically inactive.
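The 10,000-steps-per-day benchmark cited in the pedometer study above amounts to a simple threshold classifier over daily step counts. A minimal sketch; the threshold is the one the abstract cites, while the sample step values are invented for illustration.

```python
# Classify a day's pedometer count against the 10,000-step benchmark.
def meets_step_guideline(steps_per_day: float, threshold: int = 10_000) -> bool:
    return steps_per_day >= threshold

daily_steps = [6773, 11200, 4500, 10000]  # invented example values
share_active = sum(meets_step_guideline(s) for s in daily_steps) / len(daily_steps)
print(share_active)  # 0.5
```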
Obesity and physical inactivity represent major risks for a number of non-communicable diseases, and an early intervention is most appropriate. The Arab Teens Lifestyle Study (ATLS) is a multicenter project for assessing the lifestyle habits of Arab adolescents. This study reports on the convergent validity of the physical activity questionnaire used in ATLS against an electronic pedometer. Participants were 39 males and 36 females randomly selected from secondary schools, with a mean age of 16.1 ± 1.1 years. The ATLS self-reported questionnaire was validated against the electronic pedometer over three consecutive weekdays. Mean step counts were 6,866 ± 3,854 steps/day, with no significant gender difference observed. Questionnaire results showed no significant gender differences in time spent on total or moderate-intensity activities. However, males spent significantly more time than females on vigorous-intensity activity. The correlation of step counts with total time spent on all activities by the questionnaire was 0.369. The relationship of step counts was higher with vigorous-intensity (r = 0.338) than with moderate-intensity activity (r = 0.265). Pedometer step counts showed higher correlations with time spent on walking (r = 0.350) and jogging (r = 0.383) than with the time spent on other activities. Active participants, based on pedometer assessment, were also the most active by the questionnaire. It appears that the ATLS questionnaire is a valid instrument for assessing habitual physical activity among Arab adolescents. Background and Objectives: The occurrence and progress of chronic non-communicable diseases (NCDs) is associated with unhealthy lifestyles and behaviors. Modification of barriers to a healthy lifestyle can produce great benefits. The objective of this study was to identify barriers to physical activity and healthy eating among patients attending primary health care clinics in Riyadh city.
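The convergent-validity coefficients reported above (e.g. r = 0.369 between step counts and total questionnaire activity time) are Pearson correlations between the two instruments. A self-contained sketch of that computation; the paired values are invented for illustration, not data from the study.

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

steps = [4000, 6000, 8000, 10000, 12000]   # pedometer steps/day (invented)
minutes = [20, 35, 30, 55, 60]             # questionnaire activity minutes (invented)
print(round(pearson_r(steps, minutes), 2))  # 0.93
```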
Patients and Methods: A cross-sectional study was conducted at King Khalid University Hospital (KKUH) in Riyadh city. Four hundred and fifty participants attending primary health care clinics (PHCC) from 1 March to 30 April 2007 were randomly selected. A questionnaire about barriers to physical activity and healthy eating was adapted from the CDC web site. Results: The prevalence of physical inactivity among the Saudi population in the study was 82.4% (371/450). Females were more physically inactive (87.6%, 268/306) compared to males (71.5%, 103/144) (P < .001). The most common barrier to physical activity was lack of resources (80.5%, 326/405), which was significantly higher among females than males and among the lower-income versus the higher-income group. The most common barrier to a healthy diet was lack of willpower. More than four-fifths (80.3%, 354/441) of the study group stated that they did not have enough will to stick to a diet. Conclusion: Lack of resources was the most important barrier to physical activity, while lack of willpower and social support were barriers to adherence to both physical activity and a healthy diet. OBJECTIVE To provide a current estimate of the prevalence of, and determining factors associated with, physical activity among patients attending family medicine clinics in western Saudi Arabia. METHODS A cross-sectional study was conducted using an interviewer-administered questionnaire completed by 329 randomly selected adult Saudi male and female patients attending family medicine clinics at the Armed Forces Hospitals, Taif, Saudi Arabia. The study was conducted from December 2005 to January 2006. RESULTS Approximately 54% of the participants were physically active, of whom 27.7% were practicing vigorous physical activities and 72.3% were practicing moderate physical activities.
Multivariate analysis showed that age, occupation, chronic health problems, and fear of criticism were significantly associated with practicing physical activity. CONCLUSION Patients, as targets of health services, need support and the provision of facilities for an appropriate understanding and practice of physical activity. A national policy that encourages active living and discourages sedentary habits is also needed, and health care providers should play an important role in this policy. BACKGROUND Non-adherence to preventive and therapeutic lifestyle recommendations among patients at high risk of cardiovascular disease is more prevalent and varied than previously thought. The problem needs to be addressed by those who are involved in the care of these patients. AIM To measure adherence to, and barriers to complying with, lifestyle recommendations among patients with high cardiovascular risk factors. DESIGN OF STUDY Prospective study. SETTING Six family-practice health centres in Kuwait. METHOD Data are from 334 Kuwaiti adult males and females with hypertension, type 2 diabetes, or both, who completed a routine clinic visit in one of six family practice centres. Trained staff used a structured questionnaire to obtain a detailed medical history regarding exercise habits and barriers to compliance with diet and exercise programmes. Clinical criteria assessed were height, weight, and the control of blood pressure and blood sugar. RESULTS From the study sample, 63.5% of patients reported that they were not adhering to any diet regimen, 64.4% were not participating in regular exercise, and 90.4% were overweight or obese. The main barriers to adherence to diet were unwillingness (48.6%), difficulty adhering to a diet different from that of the rest of the family (30.2%), and social gatherings (13.7%).
The main barriers to adherence to exercise were lack of time (39.0%), coexisting diseases (35.6%), and adverse weather conditions (27.8%). Factors interfering with adherence to lifestyle measures among the total sample were traditional Kuwaiti food, which is high in fat and calories (79.9%), stress (70.7%), a high consumption of fast food (54.5%), a high frequency of social gatherings (59.6%), the abundance of maids (54.1%), and the excessive use of cars (83.8%). CONCLUSION The majority of individuals in the sample were overweight, did not engage in recommended levels of physical activity, and did not follow dietary recommendations. Additional cultural and demographic variables need to be considered to improve adherence to lifestyle measures.
11,009
19,190,099
The findings in this review suggest that patient preferences regarding the communication of bad news by physicians consist of four components: setting, manner of communicating bad news, what and how much information is provided, and emotional support; and that patients' preferences are associated with demographic factors. Younger patients, female patients and more highly educated patients consistently expressed a desire to receive as much detailed information as possible and to receive emotional support. Asian patients were shown to prefer, more than Westerners do, that relatives be present when receiving bad news, and to prefer discussing their life expectancy less than Westerners do.
Most physicians regard the communication of bad news as a difficult issue in clinical oncology practice. The optimal manner of communicating bad news to patients, so that physicians can create maximal understanding in patients and facilitate their psychological adjustment, is unknown. A systematic review of the literature was conducted to clarify available knowledge on patient preferences regarding the communication of bad news and associated factors. This issue has been discussed mainly in Western countries.
Active participation in the medical consultation has been demonstrated to benefit aspects of patients' subsequent psychological well-being. We investigated two interventions promoting patient question-asking behaviour. The first was a question prompt sheet provided before the consultation, which was endorsed and worked through by the clinician. The second was a face-to-face coaching session exploring the benefits of, and barriers to, question-asking, followed by coaching in question-asking behaviour employing rehearsal techniques. Sixty patients with heterogeneous cancers, seeing two medical oncologists for the first time, were randomly assigned to one of three groups: two intervention groups and one control group. Sociodemographic variables and anxiety were assessed prior to the intervention, which preceded the consultation. The consultations were audiotaped and subsequently analysed for question-asking behaviour. Anxiety was assessed again immediately following the consultation. Questionnaires to assess patient satisfaction, anxiety and psychological adjustment were sent by mail 2 weeks following the consultation. Presentation and discussion of the prompt sheet significantly increased the total number of questions asked and the number of questions asked regarding tests and treatment. Coaching did not add significantly to the effects of the prompt sheet. Psychological outcomes were not different among the groups. We conclude that a question prompt sheet addressed by the doctor is a simple, inexpensive and effective means of promoting patient question-asking in the cancer consultation. Patient participation in medical consultations has been demonstrated to benefit their subsequent psychological well-being. Question-asking is one way in which patients can be active. We investigated 2 means of promoting cancer patient question-asking.
One was the provision of a question prompt sheet to patients prior to their initial consultation with their oncologist. The second was the active endorsement and systematic review of the question prompt sheet by their oncologist. 318 patients with heterogeneous cancers, seeing one of 5 medical and 4 radiation oncologists for the first time, were randomised to either receive or not receive a question prompt sheet. Doctors were randomised to either proactively address or passively respond to the question prompt sheet in the subsequent consultation. Anxiety was assessed prior to the consultation. Consultations were audiotaped and content analysed. Anxiety was assessed again immediately following the consultation. Within the next 10 days, patients completed questionnaires assessing information needs, anxiety and satisfaction, and were given a structured telephone interview assessing information recall. Patients provided with a question prompt sheet asked more questions about prognosis compared with controls, and oncologists gave significantly more prognostic information to these patients. Provision of the question prompt sheet prolonged consultations and increased patient anxiety; however, when oncologists specifically addressed the prompt sheet, anxiety levels were significantly reduced, consultation duration was decreased and recall was significantly improved. A patient question prompt sheet, used proactively by the doctor, is a powerful addition to the oncology consultation. © 2001 Cancer Research Campaign. BACKGROUND The aims of this study were to (i) conceptualize dimensions of a good death in Japanese cancer care, (ii) clarify the relative importance of each component of a good death and (iii) explore factors related to an individual's perception of the domains of a good death.
METHODS The general population was sampled using a stratified random sampling method (n = 2548; response rate, 51%), and bereaved families from 12 certified palliative care units were surveyed as well (n = 513; 70%). We asked the subjects about the relative importance of 57 components of a good death. RESULTS Exploratory factor analysis demonstrated 18 domains contributing to a good death. Ten domains were classified as 'consistently important domains', including 'physical and psychological comfort', 'dying in a favorite place', 'good relationship with medical staff', 'maintaining hope and pleasure', 'not being a burden to others', 'good relationship with family', 'physical and cognitive control', 'environmental comfort', 'being respected as an individual' and 'life completion'. CONCLUSIONS We quantitatively identified 18 important domains that contribute to a good death in Japanese cancer care. The next step of our work should be to conduct a national survey to identify what is required to achieve a good death. BACKGROUND Active participation and asking questions are important ways in which patients can ensure they understand what the doctor has said. This study evaluated a question prompt sheet designed to encourage patients to ask questions in the cancer consultation. PATIENTS AND METHODS Patients (n = 142) were randomised to receive (i) a question prompt sheet or (ii) a general sheet informing patients of services available through the regional Cancer Council. Recall of information was assessed in a structured interview 4-20 days after the consultation. Questionnaires to assess patient satisfaction and adjustment to cancer were sent by mail. RESULTS The question prompt sheet had a significant effect in one content area: prognosis. Thirty-five percent of patients who received the question handout asked questions about prognosis, compared to 16% of those receiving the information handout.
The prompt sheet did not increase the mean number of questions asked overall. Age, in/out-patient status, gender and involvement preference were predictive of both number and duration of patient questions. CONCLUSIONS A question prompt sheet has a limited but important effect on patient question asking behaviour in the cancer consultation PURPOSE This study evaluated a cancer consultation preparation package (CCPP) designed to facilitate patient involvement in the oncology consultation. PATIENTS AND METHODS A total of 164 cancer patients (67% response rate) were randomly assigned to receive the CCPP or a control booklet at least 48 hours before their first oncology appointment. The CCPP included a question prompt sheet, booklets on clinical decision making and patient rights, and an introduction to the clinic. The control booklet contained only the introduction to the clinic. Physicians were blinded to which intervention patients received. Patients completed questionnaires immediately after the consultation and 1 month later. Consultations were audiotaped, transcribed verbatim, and coded. RESULTS All but one patient read the information. Before the consultation, intervention patients were significantly more anxious than were controls (mean, 42 v 38; P = .04); however anxiety was equivalent at follow-up. The CCPP was reported as being significantly more useful to family members than the control booklet (P = .004). Patients receiving the intervention asked significantly more questions (11 v seven questions; P = .005), tended to interrupt the physician more (1.01 v 0.71 interruptions; P = .08), and challenged information significantly more often (twice v once; P = .05). Patients receiving the CCPP were less likely to achieve their preferred decision making style (22%) than were controls (35%; P = .06). CONCLUSION This CCPP influences patients' consultation behavior and does not increase anxiety in the long-term.
However, this intervention, without physician endorsement, reduced the percentage of patients whose preferred involvement in decision making was achieved
11,010
23,083,405
Conclusions We found weak evidence for a prospective association between the PCR and adolescent alcohol use.
Background Alcohol use among adolescents has become a major public health problem in the past decade and has large short- and long-term consequences on their health. The aim of this systematic review was to provide an overview of longitudinal cohort studies that have analyzed the association between the parent–child relationship (PCR) and change in alcohol use during adolescence.
Prognosis studies are investigations of future events or the evaluation of associations between risk factors and health outcomes in populations of patients (1). The results of such studies improve our understanding of the clinical course of a disease and assist clinicians in making informed decisions about how best to manage patients. Prognostic research also informs the design of intervention studies by helping define subgroups of patients who may benefit from a new treatment and by providing necessary information about the natural history of a disorder (2). There has recently been a rapid increase in the use of systematic review methods to synthesize the evidence on research questions related to prognosis. It is essential that investigators conducting systematic reviews thoroughly appraise the methodologic quality of included studies to be confident that a study's design, conduct, analysis, and interpretation have adequately reduced the opportunity for bias (3, 4). Caution is warranted, however, because inclusion of methodologically weak studies can threaten the internal validity of a systematic review (4). This follows abundant empirical evidence that inadequate attention to biases can cause invalid results and inferences (5-9). However, there is limited consensus on how to appraise the quality of prognosis studies (1). A useful framework to assess bias in such studies follows the basic principles of epidemiologic research (10, 11). We focus on 6 areas of potential bias: study participation, study attrition, prognostic factor measurement, confounding measurement and account, outcome measurement, and analysis. The main objectives of our review of reviews are to describe methods used to assess the quality of prognosis studies and to describe how well current practices assess potential biases.
Our secondary objective is to develop recommendations to guide future quality appraisal, both within single studies of prognostic factors and within systematic reviews of the evidence. We hope this work facilitates future discussion and research on biases in prognosis studies and systematic reviews. Methods Literature Search and Study Selection We identified systematic reviews of prognosis studies by searching MEDLINE (1966 to October 2005) using the search strategy recommended by McKibbon and colleagues (12). This strategy combines broad search terms for systematic reviews (systematic review.mp; meta-analysis.mp) and a sensitive search strategy for prognosis studies (cohort, incidence, mortality, follow-up studies, prognos*, predict*, or course). We also searched the reference lists of included reviews and methodologic papers to identify other relevant publications. We restricted our search to English-language publications. One reviewer conducted the search and selected the studies. Systematic reviews, defined as reviews of published studies with a comprehensive search and systematic selection, were included if they assessed the methodologic quality of the included studies by using 1 or more explicit criteria. We excluded studies if they were meta-analyses of independent patient data only, if their primary goal was to investigate the effectiveness of an intervention or specific diagnostic or screening tests, or if they included studies that were not done on humans. Data Extraction and Synthesis Individual items included in the quality assessment of the systematic reviews were recorded as they were reported in the publication (that is, the information that would be available to readers and future reviewers). We reviewed journal Web sites and contacted the authors of the systematic reviews for additional information when authors made such an offer in their original papers.
When reviews assessed different study designs by using different sets of quality items, we extracted only those items used to assess cohort studies. We constructed a comprehensive list of distinct items that the reviews used to assess the quality of their included studies. The full text of each review was screened. All items used by the review authors to assess the quality of studies were extracted into a computerized spreadsheet by 1 reviewer. Two experienced reviewers, a clinical epidemiologist and an epidemiologist, independently synthesized the quality items extracted from the prognosis reviews to determine how well the systematic reviews assessed potential biases. We did this in 3 steps: 1) identified distinct concepts or domains addressed by the quality items; 2) grouped each extracted quality item into the appropriate domain or domains; and 3) identified the domains necessary to assess potential biases in prognosis studies. We then used this information to assess how well the reviews' quality assessment included items from the domains necessary to assess potential biases. After completing each of the first 3 steps, the reviewers met to attempt to reach a consensus. The consensus process involved each reviewer presenting his or her observations and results, followed by discussion and debate. A third reviewer was available in cases of persistent disagreement or uncertainty. In the first step, all domains addressed by the quality items were identified. The first reviewer iteratively and progressively defined the domains as items were extracted from the included reviews. The second reviewer defined domains from a random list of all extracted quality items. Limited guidance was provided to the reviewers so that their assessments and definitions of domains would be independent. The reviewers agreed on a final set of domains that adequately and completely defined all of the extracted items.
In the second step, reviewers independently grouped each extracted item into the appropriate domains. Reviewers considered each extracted item by asking, What is each particular quality item addressing? or What are the review's authors getting at with the particular quality assessment item?. Items were grouped into the domain or domains that best represented the concepts being addressed. For example, the extracted items at least 80% of the group originally identified was located for follow-up and follow-up was sufficiently complete or doesn't jeopardize validity were each independently classified by both reviewers as assessing the domain completeness of follow-up adequate, whereas the extracted item quantification and description of all subjects lost to follow-up was classified as assessing the domain completeness of follow-up described. In the third step, we identified the domains necessary to assess potential biases. Each reviewer considered the ability of the identified domains to adequately address, at least in part, 1 of the following 6 potential biases: 1) study participation, 2) study attrition, 3) prognostic factor measurement, 4) confounding measurement and account, 5) outcome measurement, and 6) analysis. Domains were considered to adequately address part of the framework if information garnered from that domain would inform the assessment of potential bias. For example, both reviewers judged that the identified domain study population represents source population or population of interest assessed potential bias in a prognosis study, whereas the domain research question definition did not, although the latter is an important consideration in assessing the inclusion of studies in a systematic review. Finally, on the basis of our previous ratings, we looked at whether each review included items from the domains necessary to assess the 6 potential biases.
We calculated the frequency of systematic reviews assessing each potential bias and the number of reviews that adequately assessed bias overall. From this systematic synthesis, we developed recommendations for improving quality appraisal in future systematic reviews of prognosis studies. We used Microsoft Access and Excel 2002 (Microsoft Corp., Redmond, Washington) for data management and SAS for Windows, version 9.1 (SAS Institute, Inc., Cary, North Carolina) for descriptive statistics. Role of the Funding Sources The funding sources, the Canadian Institutes of Health Research, the Canadian Chiropractic Research Foundation, the Ontario Chiropractic Association, and the Ontario Ministry of Health and Long Term Care, did not have a role in the collection, analysis, or interpretation of the data or in the decision to submit the manuscript for publication. Results We identified 1384 potentially relevant articles. Figure 1 shows a flow chart of studies that were included and excluded. Figure 2 shows the number of reviews identified by year of publication. We excluded 131 systematic reviews of prognosis studies that did not seem to include any quality assessment of the included studies; this represented 44% of prognosis reviews. We included 163 reviews of prognosis studies in our analysis (13-175). The most common topics were cancer (15%), musculoskeletal disorders and rheumatology (13%), cardiovascular (10%), neurology (10%), and obstetrics (10%). Other reviews included a wide range of health and health care topics. Sixty-three percent of the reviews investigated the association between a specific prognostic factor and a particular outcome; the remainder investigated multiple prognostic factors or models. The number of primary studies included in each systematic review ranged from 3 to 167 (median, 18 [interquartile range, 12 to 31]).
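The tallying step this review describes — checking, for each included review, which of the six potential biases its quality items address, then counting coverage across reviews — can be sketched in a few lines. This is a minimal illustration, not the authors' actual coding scheme: the review names and domain assignments below are hypothetical.

```python
# Sketch of the tallying step: for each review, record which of the six
# potential biases its quality items were judged to address (step 2 of the
# consensus process), then count coverage. Data here is hypothetical.

BIAS_DOMAINS = [
    "study participation",
    "study attrition",
    "prognostic factor measurement",
    "confounding measurement and account",
    "outcome measurement",
    "analysis",
]

# Hypothetical reviews mapped to the bias domains their items address.
reviews = {
    "review_A": {"study attrition", "outcome measurement", "analysis"},
    "review_B": {"study participation", "study attrition"},
    "review_C": set(BIAS_DOMAINS),  # covers all six domains
}

# Frequency of reviews assessing each potential bias.
coverage = {d: sum(d in domains for domains in reviews.values())
            for d in BIAS_DOMAINS}

# A review "adequately assesses bias overall" only if it touches all six.
adequate = [name for name, domains in reviews.items()
            if domains >= set(BIAS_DOMAINS)]

print(coverage["study attrition"])  # 3
print(adequate)                     # ['review_C']
```

The superset check (`>=`) on sets directly encodes the "all six biases covered" criterion used in the final step.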
A complete description of the included reviews is available from the authors on request. Figure 1. Flow diagram of inclusion and exclusion criteria of systematic reviews. Figure 2. Number of systematic reviews of prognosis studies identified over time. Quality Items One hundred fifty-three reviews provided adequate detail to allow extraction of quality items. Eight hundred eighty-two distinct quality items were extracted from the reviews. Most reviews developed their own set of quality items, with only a few applying criteria from previous reviews. Most quality items Part of a series of studies into early detection in adolescent suicide, this study investigated relationships between parenting style and suicidal thoughts, acts and depression. Students (mean age 15 years) from 4 randomly chosen high schools completed self‐report questionnaires containing the Parental Bonding Instrument (PBI) and the Youth Self Report, which provided information about suicide ideation, deliberate self‐harm and depression. Significant differences for mean scores on the PBI subscales were noted between cases and noncases of depression, suicidal thoughts and deliberate self‐harm. Assignment by adolescents of their parents to the “affectionless control” quadrant of the PBI doubles the relative risk for suicidal thoughts, increases the relative risk for deliberate self‐harm 3‐fold and increases the relative risk for depression 5‐fold. It seems that the PBI may play a role in identification of vulnerable adolescents; further, it both elucidates aspects of adolescent‐parent interaction and points toward areas for intervention with at‐risk adolescents. We recommend the use of the PBI in early detection studies of adolescent suicide BACKGROUND Traditional and largely qualitative reviews of evidence are now giving way to much more structured systematic overviews that use a quantitative method to calculate the overall effect of treatment.
The latter approach is dependent on the quality of primary studies, which may introduce bias if they are of poor methodologic quality. OBJECTIVE To test the hypothesis that the inclusion of poor-quality trials in meta-analyses would bias the conclusions and produce incorrect estimates of treatment effect. METHODS An overview of randomized trials of antiestrogen therapy in subfertile men with oligospermia was performed to test the hypothesis. Data sources included online searching of MEDLINE and Science Citation Index databases between 1966 and 1994, scanning the bibliography of known primary studies and review articles, and contacting experts in the field. After independent, blind assessment, nine of 149 originally identified studies met the inclusion criteria and were selected. We assessed study quality independently. Outcome data from each study were pooled and statistically summarized. RESULTS There was a marginal improvement in pregnancy rate with antiestrogen treatment (odds ratio, 1.6; 95% confidence interval, 0.9 to 2.6). Sensitivity analyses on the basis of methodologic quality demonstrated that poor-quality studies produced a positive effect with treatment, whereas no benefit was observed with high-quality studies. CONCLUSION The results of a meta-analysis are influenced by the quality of the primary studies included. Methodologically poor studies tend to exaggerate the overall estimate of treatment effect and may lead to incorrect inferences OBJECTIVE To investigate the prospective influence of individual adolescents' sensation seeking tendency and the sensation seeking tendency of named peers on the use of alcohol and marijuana, controlling for a variety of interpersonal and attitudinal risk and protective factors. METHOD Data were collected from a cohort of adolescents (N = 428; 60% female) at three points in time, starting in the eighth grade.
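The antiestrogen overview above reports its pooled result as an odds ratio with a 95% confidence interval (1.6; 0.9 to 2.6). As a sketch of how such a figure and its Wald interval are derived from a 2×2 table, the code below uses hypothetical counts (not the trial data) chosen only to yield a similar point estimate:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Wald 95% CI from a 2x2 table:
    a = events in treated, b = non-events in treated,
    c = events in control, d = non-events in control."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: 24 pregnancies among 150 treated couples
# versus 16 among 150 controls.
or_, lo, hi = odds_ratio_ci(24, 126, 16, 134)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 1.6 0.81 3.14
```

Because the CI straddles 1.0, such a result would be read as "marginal improvement", just as the overview's pooled estimate was.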
Respondents provided information about sensation seeking, the positivity of family relations, attitudes toward alcohol and drug use, perceptions of their friends' use of alcohol and marijuana, perceptions of influence by their friends to use alcohol and marijuana, and their own use of alcohol and marijuana. In addition, they named up to three peers, whose sensation seeking and use data were integrated with respondents' data to allow for tests of hypotheses about peer clustering and substance use. RESULTS Structural equation modeling analyses revealed direct effects of peers' sensation seeking on adolescents' own use of both marijuana and alcohol 2 years later. An unexpected finding was that the individual's own sensation seeking had indirect (not direct) effects on drug use 2 years later. CONCLUSIONS These findings indicate the potential importance of sensation seeking as a characteristic on which adolescent peers cluster. Furthermore, the findings indicate that, beyond the influence of a variety of other risk factors, peer sensation seeking contributes to adolescents' substance use OBJECTIVE To identify which specific parenting behaviors are associated with the onset of alcohol and tobacco use and how they are associated. DESIGN Prospective cohort study of two groups of preadolescents surveyed annually, the first group for 4 years, the second for 3 years. SETTING Two public school districts in Southern California. SUBJECTS 1034 fifth graders and 1266 seventh graders began the study after obtaining parental consent to complete surveys in a classroom setting. By the last measurement, attrition was 37 and 38% for the two cohorts, respectively. MAIN OUTCOME MEASURES The onset of tobacco or alcohol use in the last month. RESULTS Children who reported that parents spent more time with them and communicated with them more frequently had lower onset rates of using alcohol and tobacco in the last month.
These parental interactions lead to more positive relationships with their children. Parental monitoring and positive relations were protective factors for disruptive behavior and the selection of substance-using friends. Disruptive behavior increased the odds of adolescents drinking in the last month approximately twofold and of smoking in the last month two to fourfold. CONCLUSIONS This study provides further evidence that parenting behaviors are significant precursors to adolescent disruptive behavior, vulnerability to peer pressure, and subsequent substance use. Parents should be targeted in future substance use prevention programs, before their children reach adolescence OBJECTIVE Various studies reported associations between age at first drink (AFD) and later problem behaviors. However, the nature and strength of these associations are less clear. The present study investigates differences in the links between AFD and later alcohol use and the ones between alcohol use and later drinking problems, based on quality of parent-child relationships. METHOD Structural equation models were estimated based on a three-wave, 2-year prospective study of 364 adolescents. RESULTS AFD measured at Time 1 was related to drinking quantity at Time 2, which was related to alcohol-related problems at Time 3. However, the significant links between AFD and later alcohol use existed solely among adolescents who reported high-quality relationships with their parents. More specifically, compared with all other adolescents, only those who had a late AFD and a high-quality relationship with their parents were found to have low levels of alcohol use at Time 2 and low levels of alcohol-related problems at Time 3.
CONCLUSIONS By promoting a high-quality relationship, prevention efforts might trigger a spiral of healthy developments during adolescence, including family well-being in early adolescence, late AFDs, lower alcohol-use levels, and eventually fewer alcohol-related problems in late adolescence This prospective, longitudinal study investigated the moderating role of pubertal timing on reciprocal links between adolescent appraisals of parent-child relationship quality and girls' (N=1335) and boys' (N=1203) cigarette and alcohol use across a twelve-month period. Reciprocal effects were found between parent-child relations and on-time maturing boys and girls' cigarette and alcohol use, after estimating stability in these constructs across time. Parent-child relationship quality was associated with increased alcohol use twelve months later for early maturing girls. Cigarette and alcohol use were associated with increased problems in the parent-child relationship for late maturing girls. No off-time effects were observed for off-time maturing boys in the pathways between parent-child relationship quality and substance use. Pubertal timing moderated the pathway linking parent-child relationship quality with cigarette use one year later such that the association was stronger for late maturing girls compared to early and on-time maturing girls. The findings indicate interplay between the psychosocial aspects of maturation, family relationships and adolescent substance use and highlight possible gender-specific influences Self-reports on alcohol use collected via school-based questionnaires, telephone surveys, and household interviews are central measures in many studies in the alcohol field. The validity of such self-reports remains an issue. Use of biological pipeline procedures is one way in which the quality of self-reports might be improved.
The current study tested the effectiveness of a saliva test pipeline procedure in increasing drinking disclosure rates among adolescents in the sixth and eighth grades. Two sixth-grade classes from each of 14 elementary schools (n = 828) and four eighth-grade classes from each of 8 middle schools (n = 754) were selected. Half of the classes in each school were assigned to the pipeline condition and half to the control condition. Each student in the pipeline condition was asked to provide a saliva sample via dental roll before completing a questionnaire that all students (pipeline and control) received. Pipeline students were told that "some of the saliva we collect today will be tested in a laboratory and will provide a biological measure of alcohol use." Sixth- and eighth-grade students exposed to the alcohol procedure reported 5 to 7% higher alcohol use prevalences than students in the control group. While the pattern of improved reporting under the pipeline condition held across four alcohol-use measures and two grade levels, the effect was statistically significant for only one measure. The pipeline procedures used here had small effects on adolescent self-reported alcohol use
11,011
28,802,803
The apparent reductions in all-cause mortality and diseases of ageing associated with metformin use suggest that metformin could be extending life and healthspans by acting as a geroprotective agent
This systematic review investigated whether the insulin sensitiser metformin has a geroprotective effect in humans .
OBJECTIVE: To compare the risks of short-term and long-term use of an etonogestrel-containing and ethinylestradiol-containing vaginal ring and combined oral contraceptive pills (OCPs) in a routine clinical study population. METHODS: This was a prospective, controlled, noninterventional cohort study performed in the United States and five European countries with the following two cohorts: new users of the vaginal ring and new users of combined OCPs (starters, switchers, or restarters). The study population included 33,295 users of the vaginal ring or combined OCPs recruited by 1,661 study centers. Follow-up of study participants occurred for 2 to 4 years. Main clinical outcomes of interest were cardiovascular outcomes, particularly venous and arterial thromboembolism. These outcomes were validated by attending physicians and further adjudicated by an independent board. Comprehensive follow-up ensured low loss to follow-up. Statistical analyses were based on Cox regression models. Primary statistical variable was the venous thromboembolic hazard ratio (HR) for the vaginal ring compared with combined OCPs. RESULTS: Study participants were followed-up for 66,489 woman-years. Loss to follow-up was 2.9%. The venous thromboembolism incidence rates for the vaginal ring users and combined OCPs users were 8.3 and 9.2 per 10,000 woman-years, respectively. Cox regression analysis yielded crude and adjusted HRs for the vaginal ring users compared with combined OCPs users of 0.9 and 0.8 for venous thromboembolism (95% confidence intervals [CIs] 0.5–1.6 and 0.5–1.5) and 0.8 and 0.7 (95% CIs 0.2–2.5 and 0.2–2.3) for arterial thromboembolism, respectively. CONCLUSION: Vaginal ring use and combined OCP use were associated with a similar venous and arterial thromboembolic risk during routine clinical use.
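The cohort study above expresses risk as incidence rates per 10,000 woman-years (8.3 vs 9.2) over a combined 66,489 woman-years. A minimal sketch of that rate arithmetic, assuming a hypothetical split of events and follow-up between the two cohorts (the abstract reports only the combined total):

```python
def incidence_rate(events, person_years, per=10_000):
    """Crude incidence rate per `per` person-years of follow-up."""
    return events / person_years * per

# Hypothetical split of the reported 66,489 total woman-years and
# event counts, chosen to roughly reproduce the published rates.
ring_rate = incidence_rate(27, 32_500)  # ~8.3 per 10,000 woman-years
ocp_rate = incidence_rate(31, 33_989)   # ~9.1 per 10,000 woman-years

# The ratio of crude rates only approximates the crude HR (~0.9);
# the study's actual HRs come from Cox regression models.
crude_ratio = ring_rate / ocp_rate
```

The Cox model additionally accounts for censoring and covariates, which is why the crude rate ratio and the adjusted HR (0.8) differ.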
LEVEL OF EVIDENCE: OBJECTIVE The antidiabetic properties of metformin are mediated through its ability to activate the AMP-activated protein kinase (AMPK). Activation of AMPK can suppress tumor formation and inhibit cell growth in addition to lowering blood glucose levels. We tested the hypothesis that metformin reduces the risk of cancer in people with type 2 diabetes. RESEARCH DESIGN AND METHODS In an observational cohort study using record-linkage databases and based in Tayside, Scotland, U.K., we identified people with type 2 diabetes who were new users of metformin in 1994–2003. We also identified a set of diabetic comparators, individually matched to the metformin users by year of diabetes diagnosis, who had never used metformin. In a survival analysis we calculated hazard ratios for diagnosis of cancer, adjusted for baseline characteristics of the two groups using Cox regression. RESULTS Cancer was diagnosed among 7.3% of 4,085 metformin users compared with 11.6% of 4,085 comparators, with median times to cancer of 3.5 and 2.6 years, respectively (P < 0.001). The unadjusted hazard ratio (95% CI) for cancer was 0.46 (0.40–0.53). After adjusting for sex, age, BMI, A1C, deprivation, smoking, and other drug use, there was still a significantly reduced risk of cancer associated with metformin: 0.63 (0.53–0.75). CONCLUSIONS These results suggest that metformin use may be associated with a reduced risk of cancer. A randomized trial is needed to assess whether metformin is protective in a population at high risk for cancer There is growing evidence that patients with type 2 diabetes mellitus have increased cancer risk. We examined the association between diabetes, cancer, and cancer-related mortality and hypothesized that insulin sensitizers lower cancer-related mortality.
Participants in the Diabetes Cardiovascular Risk and Evaluation: Targets and Essential Data for Commitment of Treatment study, a nationwide cross-sectional and prospective epidemiological study, were recruited from German primary care practices. In the cross-sectional study, subjects with type 2 diabetes mellitus had a higher prevalence of malignancies (66/1308, 5.1%) compared to nondiabetic subjects (185/6211, 3.0%) (odds ratio, 1.64; 95% confidence interval, 1.12-2.41) before and after adjustment for age, sex, hemoglobin A(1c), smoking status, and body mass index. Patients on metformin had a lower prevalence of malignancies, comparable with that among nondiabetic patients, whereas those on any other oral combination treatment had a 2-fold higher risk for malignancies even after adjusting for possible confounders; inclusion of metformin in these regimens decreased the prevalence of malignancies. In the prospective analyses, diabetic patients in general and diabetic patients treated with insulin (either as monotherapy or in combination with other treatments) had a 2- and 4-fold, respectively, higher mortality rate than nondiabetic patients, even after adjustment for potential confounders (incidence of cancer deaths in patients with type 2 diabetes mellitus [2.6%] vs the incidence of cancer deaths in patients without type 2 diabetes mellitus [1.2%]). Our results suggest that diabetes and medications for diabetes, with the exception of the insulin sensitizer metformin, increase cancer risk and mortality Background: Experimental studies have suggested that metformin may decrease the incidence of colorectal cancer in patients with type II diabetes. However, previous observational studies have reported contradictory results, which are likely due to important methodologic limitations.
Thus, the objective of this study was to assess whether the use of metformin is associated with the incidence of colorectal cancer in patients with type II diabetes. Methods: A cohort study of patients newly treated with non-insulin antidiabetic agents was assembled using the United Kingdom Clinical Practice Research Datalink. A nested case–control analysis was conducted, where all incident cases of colorectal cancer occurring during follow-up were identified and randomly matched with up to 10 controls. Conditional logistic regression was used to estimate adjusted rate ratios (RR) of colorectal cancer associated with ever use, and cumulative duration of use of metformin. All models accounted for latency and were adjusted for relevant potential confounding factors. Results: Overall, ever use of metformin was not associated with the incidence of colorectal cancer [RR: 0.93; 95% confidence interval (CI), 0.73–1.18]. Similarly, no dose–response relationship was observed in terms of cumulative duration of use. Conclusions: The use of metformin was not associated with the incidence of colorectal cancer in patients with type II diabetes. Impact: The results of this study do not support the launch of metformin randomized controlled trials for the chemoprevention of colorectal cancer. Cancer Epidemiol Biomarkers Prev; 22(10); 1877–83. © 2013 AACR Aims/hypothesis Studies on the link between diabetes and bladder cancer in Asians are rare. We investigated the association between diabetes and incidence of bladder cancer by using a large national insurance database. Methods A random sample of 1,000,000 individuals covered by the National Health Insurance was recruited. A total of 495,199 men and 503,748 women for all ages and 187,609 men and 189,762 women ≥40 years old and without bladder cancer at recruitment were followed from 2003 to 2005.
Cox regression evaluated the adjusted relative risk for all ages and for age ≥40 years old. Results The results were similar for all ages and for age ≥40 years. In Cox models, patients with diabetes consistently showed a significantly higher relative risk ranging from 1.36 to 1.51 after adjustment for age, sex and other potential confounders. Age, male sex, nephropathy, urinary tract diseases (infection and stone) and statin use were associated with bladder cancer, but occupation, hypertension, stroke, ischaemic heart disease, peripheral arterial disease, eye disease, dyslipidaemia and medications (oral glucose-lowering agents including sulfonylurea, metformin, acarbose and thiazolidinediones, insulin, fibrates, ACE inhibitors/angiotensin receptor blockers and calcium channel blockers) were not. Chronic obstructive pulmonary disease and living in regions other than Metropolitan Taipei were associated with lower risk. Conclusions Patients with diabetes have a higher risk of bladder cancer. The association with urinary tract diseases suggests a complex scenario in the link between bladder cancer and diabetes at different disease stages INTRODUCTION The prevalence of type 2 diabetes in Thailand is 9.8 percent, which is double the number forecast by the World Health Organization. There is inadequate information to statistically represent all Thai diabetic patients for their causes of death. OBJECTIVE To determine the clinical characteristics that predicted death and causes of death in Thai diabetic patients. MATERIAL AND METHOD This prospective cohort was a 3-year follow-up study of the Thai Diabetes Registry project done between April, 2003, and February, 2006, which registered 9,419 diabetic patients attending 11 diabetic clinics in tertiary medical centers in Bangkok and major provinces of Thailand. The dead or alive status (99.5%) was determined. The causes of death were defined by reviewing the medical records.
RESULTS: Of the 9,370 diabetic patients registered, 425 patients died (1.84 percent per year). There was an increased risk of death associated with age, type of healthcare plan, lower education, insulin use, smoking, history of coronary artery disease and cerebrovascular disease, serum creatinine and high HbA1c. Lipid-lowering medication and metformin decreased the risk of death. Cardiovascular disease, infection and cancer were the prevalent causes of death. CONCLUSION: The present study showed risk factors that influenced death, and causes of death, in Thai diabetics. BACKGROUND: In the 2.8 years of the Diabetes Prevention Program (DPP) randomised clinical trial, diabetes incidence in high-risk adults was reduced by 58% with intensive lifestyle intervention and by 31% with metformin, compared with placebo. We investigated the persistence of these effects in the long term. METHODS: All active DPP participants were eligible for continued follow-up. 2766 of 3150 (88%) enrolled for a median additional follow-up of 5.7 years (IQR 5.5–5.8). 910 participants were from the lifestyle, 924 from the metformin, and 932 from the original placebo groups. On the basis of the benefits from the intensive lifestyle intervention in the DPP, all three groups were offered group-implemented lifestyle intervention. Metformin treatment was continued in the original metformin group (850 mg twice daily as tolerated), with participants unmasked to assignment, and the original lifestyle intervention group was offered additional lifestyle support. The primary outcome was development of diabetes according to American Diabetes Association criteria. Analysis was by intention-to-treat. This study is registered with ClinicalTrials.gov, number NCT00038727. FINDINGS: During the 10.0-year (IQR 9.0–10.5) follow-up since randomisation to DPP, the original lifestyle group lost, then partly regained, weight.
The modest weight loss with metformin was maintained. Diabetes incidence rates during the DPP were 4.8 cases per 100 person-years (95% CI 4.1–5.7) in the intensive lifestyle intervention group, 7.8 (6.8–8.8) in the metformin group, and 11.0 (9.8–12.3) in the placebo group. Diabetes incidence rates in this follow-up study were similar between treatment groups: 5.9 per 100 person-years (5.1–6.8) for lifestyle, 4.9 (4.2–5.7) for metformin, and 5.6 (4.8–6.5) for placebo. Diabetes incidence in the 10 years since DPP randomisation was reduced by 34% (24–42) in the lifestyle group and 18% (7–28) in the metformin group compared with placebo. INTERPRETATION: During follow-up after DPP, incidences in the former placebo and metformin groups fell to equal those in the former lifestyle group, but the cumulative incidence of diabetes remained lowest in the lifestyle group. Prevention or delay of diabetes with lifestyle intervention or metformin can persist for at least 10 years. FUNDING: National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK). OBJECTIVE: To assess the association of hypoglycemic treatment regimens with cardiovascular adverse events and mortality in a large population of type 2 diabetic patients at increased cardiovascular risk. RESEARCH DESIGN AND METHODS: This analysis included 8,192 overweight patients with type 2 diabetes from the Sibutramine Cardiovascular Outcomes (SCOUT) trial randomized to lifestyle intervention with or without sibutramine for up to 6 years. Patients were grouped according to hypoglycemic treatment at baseline. The primary end point was the time from randomization to the first occurrence of a primary outcome event (POE): nonfatal myocardial infarction, nonfatal stroke, resuscitation after cardiac arrest, or cardiovascular death.
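As a rough check on the DPP figures quoted above, the within-trial incidence rates imply risk reductions close to the reported 58% and 31%; exact agreement is not expected, since the published figures come from hazard-based models rather than crude rate ratios. A minimal sketch of our own back-of-envelope calculation:

```python
# Crude rate-ratio check on the DPP incidence rates quoted above
# (cases per 100 person-years during the trial phase).
lifestyle, metformin, placebo = 4.8, 7.8, 11.0

reduction_lifestyle = 1 - lifestyle / placebo   # reduction vs placebo
reduction_metformin = 1 - metformin / placebo

print(f"lifestyle: {reduction_lifestyle:.0%}")  # ~56%, vs 58% reported
print(f"metformin: {reduction_metformin:.0%}")  # ~29%, vs 31% reported
```

The small gap between the crude ratios and the published percentages is the expected difference between a rate ratio and a time-to-event hazard ratio.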
Multivariable Cox proportional hazards regression models were used to assess the impact of antiglycemic treatment on POE and all-cause mortality. RESULTS: Treatments for type 2 diabetes were as follows: diet alone (n = 1,394 subjects), metformin monotherapy (n = 1,631), insulin monotherapy (n = 1,116), sulfonylurea monotherapy (n = 1,083), metformin plus sulfonylurea (n = 1,565), and metformin plus insulin (n = 1,000); 905 subjects experienced a POE and 708 died. Metformin monotherapy was associated with lower risk of POE than insulin (hazard ratio [HR], 0.74; 95% CI, 0.57–0.95; P = 0.02). Diet alone also was associated with lower risk of POE (HR, 0.65; 95% CI, 0.48–0.87; P = 0.004). Metformin monotherapy also was associated with lower mortality (HR, 0.73; 95% CI, 0.54–0.99; P < 0.05), whereas no other monotherapies or combination therapies were significantly associated with POE or all-cause mortality compared with insulin as monotherapy. CONCLUSIONS: In obese patients with type 2 diabetes and high risk of cardiovascular disease, monotherapy with metformin or diet-only treatment was associated with lower risk of cardiovascular events than treatment with insulin. Objective: This study evaluated thyroid cancer risk with regard to diabetes status and diabetes duration, and with the use of anti-diabetic drugs including sulfonylurea, metformin, insulin, acarbose, pioglitazone and rosiglitazone, by using a population-based reimbursement database in Taiwan. Methods: A random sample of 1,000,000 subjects covered by the National Health Insurance was recruited. After excluding patients with type 1 diabetes, 999,730 subjects (495,673 men and 504,057 women) were recruited into the analyses.
Logistic regression estimated the odds ratios (OR) and their 95% confidence intervals (CI) for independent variables including age, sex, diabetes status/duration, anti-diabetic drugs, other medications, comorbidities, living regions, occupation and examinations that might potentially lead to the diagnosis of thyroid cancer in various models. Results: The diabetic patients had a significantly higher probability of receiving potential detection examinations (6.38% vs. 5.83%, P < 0.0001). After multivariable adjustment, the OR (95% CI) for diabetes status was 0.816 (0.652–1.021); and for diabetes duration <1 year, 1–3 years, 3–5 years and ≥5 years vs. non-diabetes was 0.071 (0.010–0.507), 0.450 (0.250–0.813), 0.374 (0.203–0.689) and 1.159 (0.914–1.470), respectively. Among the anti-diabetic agents, only sulfonylurea was significantly associated with thyroid cancer, OR (95% CI): 1.882 (1.202–2.947). The OR (95% CI) for insulin, metformin, acarbose, pioglitazone and rosiglitazone was 1.701 (0.860–3.364), 0.696 (0.419–1.155), 0.581 (0.202–1.674), 0.522 (0.069–3.926) and 0.669 (0.230–1.948), respectively. Furthermore, patients with benign thyroid disease or other cancer, living in Kao-Ping/Eastern regions, or receiving potential detection examinations might have a significantly higher risk; and male sex, hypertension, dyslipidemia, chronic obstructive pulmonary disease, vascular complications or use of statin, aspirin or non-steroidal anti-inflammatory drugs might be associated with a significantly lower risk. Conclusions: There is a lack of an overall association between diabetes and thyroid cancer, but patients with diabetes duration <5 years have a significantly lower risk.
Sulfonylurea may increase the risk of thyroid cancer. OBJECTIVE—The purpose of this study was to examine possible factors associated with the increased risk of fractures observed with rosiglitazone in A Diabetes Outcome Progression Trial (ADOPT). RESEARCH DESIGN AND METHODS—Data from the 1,840 women and 2,511 men randomly assigned in ADOPT to rosiglitazone, metformin, or glyburide for a median of 4.0 years were examined with respect to time to first fracture, rates of occurrence, and sites of fractures. RESULTS—In men, fracture rates did not differ between treatment groups. In women, at least one fracture was reported with rosiglitazone in 60 patients (9.3% of patients, 2.74 per 100 patient-years), metformin in 30 patients (5.1%, 1.54 per 100 patient-years), and glyburide in 21 patients (3.5%, 1.29 per 100 patient-years). The cumulative incidence (95% CI) of fractures in women at 5 years was 15.1% (11.2–19.1) with rosiglitazone, 7.3% (4.4–10.1) with metformin, and 7.7% (3.7–11.7) with glyburide, representing hazard ratios (95% CI) of 1.81 (1.17–2.80) and 2.13 (1.30–3.51) for rosiglitazone compared with metformin and glyburide, respectively. The increase in fractures with rosiglitazone occurred in pre- and postmenopausal women, and fractures were seen predominantly in the lower and upper limbs. No particular risk factor underlying the increased fractures in female patients who received rosiglitazone therapy was identified.
CONCLUSIONS—Further investigation into the risk factors and underlying pathophysiology for the increased fracture rate in women taking rosiglitazone is required to relate them to preclinical data and better understand the clinical implications of, and possible interventions for, these findings. IMPORTANCE: Caloric restriction mimetic drugs have geroprotective effects that delay or reduce risks for a variety of age-associated systemic diseases, suggesting that such drugs might also have the potential to reduce risks of blinding ophthalmologic conditions for which age is a major risk factor. OBJECTIVE: To determine whether the caloric restriction mimetic drug metformin hydrochloride is associated with reduced risk of open-angle glaucoma (OAG) in persons with diabetes mellitus. DESIGN, SETTING, AND PATIENTS: Retrospective cohort study of patients aged 40 years or older with diabetes mellitus and no preexisting record of OAG in a large US managed care network from January 1, 2001, through December 31, 2010. EXPOSURES: Quantity of metformin and other prescribed diabetes medications as captured from outpatient pharmacy records. MAIN OUTCOMES AND MEASURES: Risk of developing OAG. RESULTS: Of 150,016 patients with diabetes mellitus, 5,893 (3.9%) developed OAG. After adjusting for confounding factors, those prescribed the highest quartile of metformin hydrochloride (>1,110 g in 2 years) had a 25% reduced OAG risk relative to those who took no metformin (hazard ratio = 0.75; 95% CI, 0.59–0.95; P = .02). Every 1-g increase in metformin hydrochloride use was associated with a 0.16% reduction in OAG risk (adjusted hazard ratio = 0.99984; 95% CI, 0.99969–0.99999; P = .04), which predicts that taking a standard dose of 2 g of metformin hydrochloride per day for 2 years would result in a 20.8% reduction in risk of OAG.
After accounting for potential confounders, including metformin and diabetic medications, the risk of developing OAG was increased by 8% (hazard ratio = 1.08; 95% CI, 1.03–1.13; P = .003) for each unit of increase in glycated hemoglobin level. CONCLUSIONS AND RELEVANCE: Metformin use is associated with reduction in risk of developing OAG, and risk is reduced even when accounting for glycemic control in the form of glycated hemoglobin level. Other diabetes medications did not confer a similar OAG risk reduction. This study suggests that metformin may be affecting OAG risk on multiple levels, some involving improved glycemic control and some involving mechanisms outside glycemic control such as neurogenesis, inflammatory systems, or longevity pathways targeted by caloric restriction mimetic drugs. If confirmed by prospective clinical trials, these findings could lead to novel treatments for this sight-threatening disease.
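The 20.8% figure in the glaucoma abstract above follows from compounding the per-gram hazard ratio over the cumulative dose: 2 g/day for 2 years is about 1,460 g. A minimal sketch of that arithmetic (the 365-day year is our assumption):

```python
# Per-gram adjusted hazard ratio for OAG reported in the abstract.
hr_per_gram = 0.99984

# Assumed exposure: 2 g/day for 2 years (2 * 365 days) = 1,460 g total.
total_grams = 2 * 365 * 2

# Compound the per-gram hazard ratio over the cumulative dose.
hr_total = hr_per_gram ** total_grams
risk_reduction = 1 - hr_total

print(f"{risk_reduction:.1%}")  # ~20.8%, matching the abstract
```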
11,012
30,362,897
Conclusion: The use of conceptual models underpinning the development of advance care planning is uncommon. When used, they identify individual behavioural change. Strengthening patients' motivation and competence in participating in advance care planning discussions are key mechanisms of change.
Background: No systematic review has focused on conceptual models underpinning advance care planning for patients with advanced cancer, and the mechanisms of action in relation to the intended outcomes. Aim: To appraise conceptual models and develop a logic model of advance care planning for advanced cancer patients, examining the components, processes, theoretical underpinning, mechanisms of action and linkage with intended outcomes.
OBJECTIVE: Little is known about the effectiveness of advance care planning in the United Kingdom, although policy documents recommend that it should be available to all those with life-limiting illness. METHOD: An exploratory patient-preference randomized controlled trial of advance care planning discussions with an independent mediator (maximum three sessions) was conducted in London outpatient oncology clinics and a nearby hospice. Seventy-seven patients (mean age 62 years, 39 male) with various forms of recurrent progressive cancer participated, and 68 (88%) completed follow-up at 8 weeks. Patients completed visual analogue scales assessing perceived ability to discuss end-of-life planning with healthcare professionals or family and friends (primary outcome), happiness with the level of communication, and satisfaction with care, as well as a standardized measure of anxiety and depression. RESULTS: Thirty-eight patients (51%) showed preference for the intervention. Discussions with professionals or family and friends about the future increased in the intervention arms, whether randomized or preference, but happiness with communication was unchanged or worse, and satisfaction with services decreased. Trial participation did not cause significant anxiety or depression, and attrition was low. SIGNIFICANCE OF RESULTS: A randomized trial of advance care planning is possible. This study provides new evidence on its acceptability and effectiveness for patients with advanced cancer. PURPOSE: Decision making regarding cardiopulmonary resuscitation (CPR) is challenging. This study examined the effect of a video decision support tool on CPR preferences among patients with advanced cancer. PATIENTS AND METHODS: We performed a randomized controlled trial of 150 patients with advanced cancer from four oncology centers.
Participants in the control arm (n = 80) listened to a verbal narrative describing CPR and the likelihood of successful resuscitation. Participants in the intervention arm (n = 70) listened to the identical narrative and viewed a 3-minute video depicting a patient on a ventilator and CPR being performed on a simulated patient. The primary outcome was participants' preference for or against CPR, measured immediately after exposure to either modality. Secondary outcomes were participants' knowledge of CPR (score range of 0 to 4, with higher score indicating more knowledge) and comfort with the video. RESULTS: The mean age of participants was 62 years (standard deviation, 11 years); 49% were women, 44% were African American or Latino, and 47% had lung or colon cancer. After the verbal narrative, in the control arm, 38 participants (48%) wanted CPR, 41 (51%) wanted no CPR, and one (1%) was uncertain. In contrast, in the intervention arm, 14 participants (20%) wanted CPR, 55 (79%) wanted no CPR, and 1 (1%) was uncertain (unadjusted odds ratio, 3.5; 95% CI, 1.7 to 7.2; P < .001). Mean knowledge scores were higher in the intervention arm than in the control arm (3.3 ± 1.0 v 2.6 ± 1.3, respectively; P < .001), and 65 participants (93%) in the intervention arm were comfortable watching the video. CONCLUSION: Participants with advanced cancer who viewed a video of CPR were less likely to opt for CPR than those who listened to a verbal narrative. Background: The Medical Research Council's framework for complex interventions has been criticized for not including theory-driven approaches to evaluation. Although the framework does include broad guidance on the use of theory, it contains little practical guidance for implementers, and there have been calls to develop a more comprehensive approach.
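The unadjusted odds ratio in the CPR-video trial above can be approximated directly from the reported counts. This is our own crude 2×2 calculation that ignores the two uncertain participants, so a small difference from the published 3.5 is expected:

```python
# Crude 2x2 odds-ratio check for declining CPR, from the counts above.
# Intervention (video) arm vs control (verbal narrative) arm.
no_cpr_video, cpr_video = 55, 14
no_cpr_control, cpr_control = 41, 38

odds_ratio = (no_cpr_video / cpr_video) / (no_cpr_control / cpr_control)
print(f"OR ~ {odds_ratio:.2f}")  # ~3.64, close to the reported 3.5
```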
A prospective, theory-driven process of intervention design and evaluation is required to develop complex healthcare interventions which are more likely to be effective, sustainable and scalable. Methods: We propose a theory-driven approach to the design and evaluation of complex interventions by adapting and integrating a programmatic design and evaluation tool, Theory of Change (ToC), into the MRC framework for complex interventions. We provide a guide to what ToC is, how to construct one, and how to integrate its use into research projects seeking to design, implement and evaluate complex interventions using the MRC framework. We test this approach by using ToC within two randomized controlled trials and one non-randomized evaluation of complex interventions. Results: Our application of ToC in three research projects has shown that ToC can strengthen key stages of the MRC framework. It can aid the development of interventions by providing a framework for enhanced stakeholder engagement and by explicitly designing an intervention that is embedded in the local context. For the feasibility and piloting stage, ToC enables the systematic identification of knowledge gaps to generate research questions that strengthen intervention design. ToC may improve the evaluation of interventions by providing a comprehensive set of indicators to evaluate all stages of the causal pathway through which an intervention achieves impact, combining evaluations of intervention effectiveness with detailed process evaluations into one theoretical framework. Conclusions: Incorporating a ToC approach into the MRC framework holds promise for improving the design and evaluation of complex interventions, thereby increasing the likelihood that the intervention will be ultimately effective, sustainable and scalable.
We urge researchers developing and evaluating complex interventions to consider using this approach, to evaluate its usefulness and to build an evidence base to further refine the methodology. Trial registration: ClinicalTrials.gov: PURPOSE: To determine whether the use of a goals-of-care video to supplement a verbal description can improve end-of-life decision making for patients with cancer. METHODS: Fifty participants with malignant glioma were randomly assigned to either a verbal narrative of goals-of-care options at the end of life (control), or a video after the same verbal narrative (intervention), in this randomized controlled trial. The video depicts three levels of medical care: life-prolonging care (cardiopulmonary resuscitation [CPR], ventilation), basic care (hospitalization, no CPR), and comfort care (symptom relief). The primary study outcome was participants' preferences for end-of-life care. The secondary outcome was participants' uncertainty regarding decision making (score range, 3 to 15; higher score indicating less uncertainty). Participants' comfort level with the video was also measured. RESULTS: Fifty participants were randomly assigned to either the verbal narrative (n = 27) or video (n = 23). After the verbal description, 25.9% of participants preferred life-prolonging care, 51.9% basic care, and 22.2% comfort care. In the video arm, no participants preferred life-prolonging care, 4.4% preferred basic care, 91.3% preferred comfort care, and 4.4% were uncertain (P < .0001). The mean uncertainty score was higher in the video group than in the verbal group (13.7 v 11.5, respectively; P < .002). In the intervention arm, 82.6% of participants reported being very comfortable watching the video.
CONCLUSION: Compared with participants who only heard a verbal description, participants who viewed a goals-of-care video were more likely to prefer comfort care and avoid CPR, and were more certain of their end-of-life decision making. Participants reported feeling comfortable watching the video. Objective: To investigate the impact of advance care planning on end-of-life care in elderly patients. Design: Prospective randomised controlled trial. Setting: Single-centre study in a university hospital in Melbourne, Australia. Participants: 309 legally competent medical inpatients aged 80 or more, followed for six months or until death. Interventions: Participants were randomised to receive usual care or usual care plus facilitated advance care planning. Advance care planning aimed to assist patients to reflect on their goals, values, and beliefs; to consider future medical treatment preferences; to appoint a surrogate; and to document their wishes. Main outcome measures: The primary outcome was whether a patient's end-of-life wishes were known and respected. Other outcomes included patient and family satisfaction with hospital stay and levels of stress, anxiety, and depression in relatives of patients who died. Results: 154 of the 309 patients were randomised to advance care planning, 125 (81%) received advance care planning, and 108 (84%) expressed wishes or appointed a surrogate, or both. Of the 56 patients who died by six months, end-of-life wishes were much more likely to be known and followed in the intervention group (25/29, 86%) compared with the control group (8/27, 30%; P < 0.001). In the intervention group, family members of patients who died had significantly less stress (intervention 5, control 15; P < 0.001), anxiety (intervention 0, control 3; P = 0.02), and depression (intervention 0, control 5; P = 0.002) than those of the control patients.
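The headline result of the Melbourne advance care planning trial above can be restated as a relative "risk" (here, the probability of end-of-life wishes being known and followed). This is our own back-of-envelope calculation from the reported counts, not a figure from the paper:

```python
# Probability of wishes being known and followed, from the counts above.
p_intervention = 25 / 29   # ~86% in the advance care planning group
p_control = 8 / 27         # ~30% in the usual care group

relative_risk = p_intervention / p_control
print(f"RR ~ {relative_risk:.2f}")  # ~2.91
```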
Patient and family satisfaction was higher in the intervention group. Conclusions: Advance care planning improves end-of-life care and patient and family satisfaction, and reduces stress, anxiety, and depression in surviving relatives. Trial registration: Australian New Zealand Clinical Trials Registry ACTRN12608000539336. BACKGROUND: Cardiopulmonary resuscitation (CPR) is an important advance directive (AD) topic in patients with progressive cancer; however, such discussions are challenging. OBJECTIVE: This study investigates whether video educational information about CPR engenders broader advance care planning (ACP) discourse. METHODS: Patients with progressive pancreas or hepatobiliary cancer were randomized to an educational CPR video or a similar CPR narrative. The primary end-point was the difference in ACP documentation one month posttest between arms. Secondary end-points included study impressions; pre- and post-intervention knowledge of and preferences for CPR and mechanical ventilation; and longitudinal patient outcomes. RESULTS: Fifty-six subjects were consented and analyzed. Rates of ACP documentation (either formal ADs or documented discussions) were 40% in the video arm (12/30) compared to 15% in the narrative arm (4/26), OR = 3.6 [95% CI: 0.9–18.0], p = 0.07. Post-intervention knowledge was higher in both arms. Posttest, preferences for CPR had changed in the video arm but not in the narrative arm. Preferences regarding mechanical ventilation did not change in either arm. The majority of subjects in both arms reported the information as helpful and comfortable to discuss, and they recommended it to others. More deaths occurred in the video arm compared to the narrative arm, and more subjects died in hospice settings in the video arm.
CONCLUSIONS: This pilot randomized trial addressing downstream ACP effects of video versus narrative decision tools demonstrated a trend towards more ACP documentation in video subjects. This trend, as well as other video effects, is the subject of ongoing study. BACKGROUND: Advance Care Planning (ACP) is a systematic approach to ensure that effective advance directives (ADs) are developed and respected. We studied the effects of implementing a regional ACP program in Germany. METHODS: In a prospective, inter-regionally controlled trial focusing on nursing homes (n/hs), we compared the number, relevance and validity of new ADs completed in the intervention region versus the control region. Intervention n/h residents and their families were offered professional facilitation, including standardized documentation. RESULTS: Data from 136 residents of three intervention n/hs were compared with data from 439 residents of 10 control n/hs over a study period of 16.5 months. In the intervention region, 49 (36.0%) participating residents completed a new AD over the period of the study, compared to 18 (4.1%) in the control region; these ADs included 30 ADs by proxy in the intervention region versus 10 in the control region. Proxies were designated in 94.7% versus 50.0% of cases, the AD was signed by a physician in 93.9% versus 16.7%, and an emergency order was included in 98.0% versus 44.4%. Resuscitation status was addressed in 95.9% versus 38.9% of cases (p < 0.01 for all of the differences mentioned above). In the intervention region, new ADs were preceded by an average of 2.5 facilitated conversations (range, 2–5) with a mean total duration of 100 minutes (range, 60–240 minutes). CONCLUSION: The implementation of an ACP program in German nursing homes led, much more frequently than previously reported, to the creation of advance directives with potential relevance to medical decision-making.
Future research should assess the effect of such programs on clinical and structural outcomes. Purpose: To build on results of a cluster randomized controlled trial (RCT) of a combined patient-oncologist intervention to improve communication in advanced cancer, we conducted a post hoc analysis of the patient intervention component, a previsit patient coaching session that used a question prompt list (QPL). We hypothesized that intervention-group participants would bring up more QPL-related topics, particularly prognosis-related topics, during the subsequent oncologist visit. Patients and Methods: This cluster RCT included 170 patients who had advanced nonhematologic cancer (and their caregivers) recruited from practices of 24 participating oncologists in western New York. Intervention-group oncologists (n = 12) received individualized communication training; up to 10 of their patients (n = 84) received a previsit individualized communication coaching session that incorporated a QPL. Control-group oncologists (n = 12) and patients (n = 86) received no interventions. Topics of interest identified by patients during the coaching session were summarized from coaching notes; one office visit after the coaching session was audio recorded, transcribed, and analyzed by using linear regression modeling for group differences. Results: Compared with controls, more than twice as many intervention-group participants brought up QPL-related topics during their office visits (70.2% v 32.6%; P < .001). Patients in the intervention group were nearly three times more likely to ask about prognosis (16.7% v 5.8%; P = .03). Of 262 topics of interest identified during coaching, 158 (60.3%) were QPL-related; 20 (12.7%) addressed prognosis. Overall, patients in the intervention group brought up 82.4% of topics of interest during the office visit.
Conclusion: A combined coaching and QPL intervention was effective in helping patients with advanced cancer and their caregivers identify and bring up topics of concern, including prognosis, during their subsequent oncologist visits. Considering that most patients are misinformed about prognosis, more intensive steps are needed to better promote such discussion. BACKGROUND: The use of advance directives is recommended so that people can determine the medical care they will receive when they are no longer competent, but the effectiveness of such directives is not clear. METHODS: In a prospective study conducted over a two-year period, 126 competent residents of a nursing home and 49 family members of incompetent patients were interviewed to determine their preferences with respect to hospitalization, intensive care, cardiopulmonary resuscitation, artificial ventilation, surgery, and tube feeding in the event of critical illness, terminal illness, or permanent unconsciousness. Advance directives, consisting of signed statements of treatment preferences, were placed in the medical record to assist in care in the nursing home and to be forwarded to the hospital if necessary. RESULTS: In an analysis of 96 outcome events (hospitalization or death in the nursing home), care was consistent with previously expressed wishes 75 percent of the time; however, the presence of the written advance directive in the medical record did not facilitate consistency. Among the 24 events in which inconsistencies occurred, care was provided more aggressively than had been requested in 6 cases, largely because of unanticipated surgery or artificial ventilation, and less aggressively than requested in 18, largely because hospitalization or cardiopulmonary resuscitation was withheld. Inconsistencies were more likely in the nursing home than in the hospital. CONCLUSIONS:
The effectiveness of written advance directives is limited by inattention to them and by decisions to place priority on considerations other than the patient's autonomy. Since our study was performed in only one nursing home and one hospital, other studies are necessary to determine the generalizability of our findings. PURPOSE: To determine whether provision of a question prompt list (QPL) influences advanced cancer patients'/caregivers' questions and discussion of topics relevant to end-of-life care during consultations with a palliative care (PC) physician. PATIENTS AND METHODS: This randomized controlled trial included patients randomly assigned to standard consultation or provision of a QPL before consultation, with endorsement of the QPL by the physician during the consultation. Consecutive eligible patients with advanced cancer referred to 15 PC physicians from nine Australian PC services were invited to participate. Consultations were audiotaped, transcribed, and analyzed by blinded coders; patients completed questionnaires before, within 24 hours of, and 3 weeks after the consultation. RESULTS: A total of 174 patients participated (92 QPL, 82 control). Compared with controls, QPL patients and caregivers asked twice as many questions (for patients, ratio, 2.3; 95% CI, 1.7 to 3.2; P < .0001), and patients discussed 23% more issues covered by the QPL (95% CI, 11% to 37%; P < .0001). QPL patients asked more prognostic questions (ratio, 2.3; 95% CI, 1.3 to 4.0; P = .004) and discussed more prognostic issues (ratio, 1.43; 95% CI, 1.1 to 1.8; P = .003) and end-of-life issues (30% v 10%; P = .001). Fewer QPL patients had unmet information needs about the future (χ²(1) = 4.14; P = .04), which was the area of greatest unmet information need. QPL consultations (average, 38 minutes) were longer (P = .002) than controls (average, 31 minutes).
No differences between groups were observed in anxiety or patient/physician satisfaction. CONCLUSION: Providing a QPL, with physician endorsement of its use, assists terminally ill cancer patients and their caregivers to ask questions and promotes discussion about prognosis and end-of-life issues, without creating patient anxiety or impairing satisfaction. PURPOSE: This study tested the efficacy of an intervention on end-of-life decision making for patients with advanced cancer. PATIENTS AND METHODS: One hundred twenty patients with metastatic cancer who were no longer being treated with curative intent (and 87 caregivers) were randomly assigned to the intervention (n = 55) or treatment as usual (n = 65). Primary outcome measures were the proportion of patients with do-not-resuscitate (DNR) orders, timing of DNR orders, and place of death. Secondary outcome measures were completed at study enrollment, 3 weeks later, and 3 months later, including patients' knowledge, mood, and caregiver burden. RESULTS: High, but equivalent, rates of DNR orders were observed in both groups. In per-protocol analyses, DNR orders were placed earlier for patients who received the intervention (median, 27 v 12.5 days; 95% CI, 1.1 to 5.9; P = .03), and they were more likely to avoid a hospital death (19% v 50%; 95% CI, 11% to 50%; P = .004). Differences between the groups over time were evident for estimates of cardiopulmonary resuscitation (CPR) success rates (P = .01) but not knowledge of CPR (P = .2). There was no evidence that the intervention resulted in more anxious or depressive symptoms. Caregivers experienced less burden in terms of disruption to schedule if the patient received the intervention (P = .05). CONCLUSION: An intervention, consisting of an informational pamphlet and discussion, was associated with earlier placement of DNR orders relative to death and less likelihood of death in hospital.
There was no negative impact of the intervention on secondary outcomes, although the sample may have been too small to detect differences. BACKGROUND Patients are often not given the information needed to understand their prognosis and make informed treatment choices, with many consequently experiencing less than optimal care and quality-of-life at end-of-life. OBJECTIVES To evaluate the efficacy of a nurse-facilitated communication support program for patients with advanced, incurable cancer to assist them in discussing prognosis and end-of-life care. DESIGN A parallel-group randomised controlled trial design was used. SETTINGS This trial was conducted at six cancer treatment centres affiliated with major hospitals in Sydney, Australia. PARTICIPANTS 110 patients with advanced, incurable cancer participated. METHODS The communication support program included guided exploration of a question prompt list, communication challenges, patient values and concerns and the value of discussing end-of-life care early, with oncologists cued to endorse question-asking and question prompt list use. Patients were randomised after baseline measure completion, a regular oncology consultation was audio-recorded and a follow-up questionnaire was completed one month later. Communication, health-related quality-of-life and satisfaction measures and a manualised consultation-coding scheme were used. Descriptive, Mixed Modelling and Generalised Linear Mixed Modelling analyses were conducted using SPSS version 22. RESULTS Communication support program recipients gave significantly more cues for discussion of prognosis, end-of-life care, future care options and general issues not targeted by the intervention during recorded consultations, but did not ask more questions about these issues or overall. Oncologists' question prompt list and question-asking endorsement was inconsistent. 
Communication support program recipients' self-efficacy in knowing what questions to ask their doctor significantly improved at follow-up while control arm patients' self-efficacy declined. The communication support program did not impact patients' health-related quality-of-life or the likelihood that their health information or shared decision-making preferences would be met. Satisfaction with the communication support program was high. CONCLUSIONS Given the importance of clarifying prognostic expectations and end-of-life care wishes in the advanced cancer context, the communication support program appears to be an effective and well-received solution to encourage early information seeking related to these issues, though its long-term impact remains unclear. The manualised nature of the intervention, designed with existing clinical staff in mind, may make it suited for implementation in a clinical setting, though additional work is needed to identify why question asking was unaffected and establish its impact later in the illness trajectory. Importance Observational studies demonstrate links between patient-centered communication, quality of life (QOL), and aggressive treatments in advanced cancer, yet few randomized clinical trials (RCTs) of communication interventions have been reported. Objective To determine whether a combined intervention involving oncologists, patients with advanced cancer, and caregivers would promote patient-centered communication, and to estimate intervention effects on shared understanding, patient-physician relationships, QOL, and aggressive treatments in the last 30 days of life. 
Design, Setting, and Participants Cluster RCT at community- and hospital-based cancer clinics in Western New York and Northern California; 38 medical oncologists (mean age, 44.6 years; 11 [29%] female) and 265 community-dwelling adult patients with advanced nonhematologic cancer participated (mean age, 64.4 years; 146 [55.0%] female; 235 [89%] white; enrolled August 2012 to June 2014; followed for 3 years); 194 patients had participating caregivers. Interventions Oncologists received individualized communication training using standardized patient instructors while patients received question prompt lists and individualized communication coaching to identify issues to address during an upcoming oncologist visit. Both interventions focused on engaging patients in consultations, responding to emotions, informing patients about prognosis and treatment choices, and balanced framing of information. Control participants received no training. Main Outcomes and Measures The prespecified primary outcome was a composite measure of patient-centered communication coded from audio recordings of the first oncologist visit following patient coaching (intervention group) or enrollment (control). Secondary outcomes included the patient-physician relationship, shared understanding of prognosis, QOL, and aggressive treatments and hospice use in the last 30 days of life. Results Data from 38 oncologists (19 randomized to intervention) and 265 patients (130 intervention) were analyzed. In fully adjusted models, the intervention resulted in clinically and statistically significant improvements in the primary physician-patient communication end point (adjusted intervention effect, 0.34; 95% CI, 0.06-0.62; P = .02). Differences in secondary outcomes were not statistically significant. 
Conclusions and Relevance A combined intervention that included oncologist communication training and coaching for patients with advanced cancer was effective in improving patient-centered communication but did not affect secondary outcomes. Trial Registration clinicaltrials.gov Identifier :
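The adjusted intervention effect reported above (0.34; 95% CI, 0.06 to 0.62; P = .02) can be sanity-checked by recovering the standard error and two-sided p-value from the confidence interval under a normal (Wald) approximation. A minimal sketch, not taken from any trial's analysis code, with the function name and symmetric-interval assumption being illustrative only:

```python
import math

def z_and_p_from_ci(estimate, lo, hi, level_z=1.96):
    """Recover SE, z, and a two-sided p-value from a point estimate and
    a 95% CI, assuming a symmetric normal (Wald) interval."""
    se = (hi - lo) / (2 * level_z)          # half-width of the CI over z_0.975
    z = estimate / se
    # two-sided p from the standard normal CDF, via math.erf
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return se, z, p

# Reported primary communication end point: 0.34 (95% CI, 0.06 to 0.62)
se, z, p = z_and_p_from_ci(0.34, 0.06, 0.62)
```

With these inputs the recovered p-value rounds to .02, consistent with the value reported in the abstract.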
11,013
30,920,375
Evidence suggested that serious gaming was at least as effective as other digital education modalities for these outcomes. There was insufficient evidence to conclude whether one type of serious gaming/gamification intervention is more effective than any other. There was limited evidence for the effects of serious gaming/gamification on professional attitudes. Serious gaming/gamification appears to be at least as effective as controls, and in many studies, more effective for improving knowledge, skills, and satisfaction.
BACKGROUND There is a worldwide shortage of health workers, and this issue requires innovative education solutions. Serious gaming and gamification education have the potential to provide a quality, cost-effective, novel approach that is flexible, portable, and enjoyable and allows interaction with tutors and peers. OBJECTIVE The aim of this systematic review was to evaluate the effectiveness of serious gaming/gamification for health professions education compared with traditional learning, other types of digital education, or other serious gaming/gamification interventions in terms of patient outcomes, knowledge, skills, professional attitudes, and satisfaction (primary outcomes) as well as economic outcomes of education and adverse events (secondary outcomes).
Introduction Virtual environments offer a variety of benefits and may be a powerful medium with which to provide nursing education. The objective of this study was to compare the achievement of learning outcomes of undergraduate nursing students when a virtual patient trainer or a traditional lecture was used to teach pediatric respiratory content. Methods This was a randomized, controlled, posttest design. A virtual pediatric hospital unit was populated with four virtual pediatric patients having different respiratory diseases that were designed to meet the same learning objectives as a traditional lecture. The study began in Spring 2010 with 93 Senior I, baccalaureate nursing students. Students were randomized to receive either a traditional lecture or an experience with a virtual patient trainer. Students' knowledge acquisition was evaluated using multiple-choice questions, and knowledge application was measured as timeliness of care in two simulated clinical scenarios using high-fidelity mannequins and standardized patients. Results Ninety-three students participated in the study, of which 46 were in the experimental group that received content using the virtual patient trainer. After the intervention, students in the experimental group had significantly higher knowledge acquisition (P = 0.004) and better knowledge application (P = 0.001) for each of the two scenarios than students in the control group. Conclusions The purpose of this project was to compare a virtual patient trainer to a traditional lecture for the achievement of learning outcomes for pediatric respiratory content. Although the virtual patient trainer experience produced statistically better outcomes, the differences may not be clinically significant. The results suggest that a virtual patient trainer may be an effective substitute for the achievement of learning outcomes that are typically met using a traditional lecture format. 
Further research is needed to understand how best to integrate a virtual patient trainer into undergraduate nursing education. BACKGROUND An important facet of laparoscopic surgery is its psychomotor component. As this aspect of surgery gains attention, lessons from other psychomotor-intense fields such as athletics have led to an investigation of the benefits of "warming up" prior to entering the operating room. Practical implementation of established methods of warm-up is hampered by a reliance on special equipment and instrumentation that are not readily available. In light of emerging evidence of translatability between video-game play and operative performance, we sought to find if laparoscopic task performance improved after warming up on a mobile device balance game. MATERIALS AND METHODS Laparoscopic novices were randomized into either the intervention group (n = 20) or the control group (n = 20). The intervention group played a mobile device balance game for 10 min while the control group did no warm-up whatsoever. Assessment was performed using two tasks on the ProMIS laparoscopic simulation system: "object positioning" (where small beads are transferred between four cups) and "tissue manipulation" (where pieces of plastic are stretched over pegs). Metrics measured were time to task completion, path length, smoothness, hand dominance, and errors. RESULTS The intervention group made fewer errors: object positioning task 0.20 versus 0.70, P = 0.01; tissue manipulation task 0.15 versus 0.55, P = 0.05; total errors 0.35 versus 1.25, P = 0.002. The two groups performed similarly on the other metrics. 
CONCLUSIONS Warm-up using a mobile device balance game decreases errors on basic tasks performed on a laparoscopic surgery simulator, suggesting a practical way to warm up prior to cases in the operating room. OBJECTIVE The aim of this study was to assess, using a Web-based format, third-year medical students' pediatric knowledge and perceptions of game playing with faculty facilitation compared with self-study computerized flash cards. METHODS This study used a repeated-measures experimental design with random assignment to a game group or self-study group. Pediatric knowledge was tested using multiple choice exams at baseline, week 6 of the clerkship following a 4-week intervention, and 6 weeks later. Perceptions about game playing and self-study were evaluated using a questionnaire at week 6. RESULTS The groups did not differ on content mastery, perceptions about content, or time involved in game playing or self-study. Perceptions about game playing versus self-study as a pedagogical method appeared to favor game playing in understanding content (P<.001), perceived help with learning (P<.05), and enjoyment of learning (P<.008). An important difference was increased game group willingness to continue participating in the intervention. CONCLUSIONS Games can be an enjoyable and motivating method for learning pediatric content, enhanced by group interactions, competition, and fun. Computerized, Web-based tools can facilitate access to educational resources and are feasible to apply as an adjunct to teaching clinical medicine. Background Most patients with diabetes mellitus (DM) are followed by primary care physicians, who often lack knowledge or confidence to prescribe insulin properly. This contributes to clinical inertia and poor glycemic control. The effectiveness of traditional continuing medical education (CME) to solve that is limited, so new approaches are required. 
Electronic games are a good option, as they can be very effective and easily disseminated. Objective The objective of our study was to assess applicability, user acceptance, and educational effectiveness of InsuOnline, an electronic serious game for medical education on insulin therapy for DM, compared with a traditional CME activity. Methods Primary care physicians (PCPs) from the South of Brazil were invited by phone or email to participate in an unblinded randomized controlled trial and randomly allocated to play the game InsuOnline, installed as an app on their own computers, at the time of their choice, with minimal or no external guidance, or to participate in a traditional CME session, composed of onsite lectures and case discussions. Both interventions had the same content and duration (~4 h). Applicability was assessed by the number of subjects who completed the assigned intervention in each group. Insulin-prescribing competence (factual knowledge, problem-solving skills, and attitudes) was self-assessed through a questionnaire applied before, immediately after, and 3 months after the interventions. Acceptance of the intervention (satisfaction and perceived importance for clinical practice) was also assessed immediately after and 3 months after the interventions, respectively. Results Subjects' characteristics were similar between groups (mean age 38, 51.4% [69/134] male). In the game group, 69 of 88 (78%) completed the intervention, compared with 65 of 73 (89%) in the control group, with no difference in applicability. The percentage of right answers in the competence subscale, which was 52% at baseline in both groups, significantly improved immediately after both interventions to 92% in the game group and to 85% in control (P<.001). After 3 months, it remained significantly higher than at baseline in both groups (80% in game, and 76% in control; P<.001). 
The absolute increase in competence score was better with the game (40%) than with traditional CME (34%; P=.01). Insulin-related attitudes were improved both after the game (significant improvement in 4 of 9 items) and after the control activity (3 of 9). Both interventions were very well accepted, with most subjects rating them as "fun or pleasant," "useful," and "practice-changing." Conclusions The game InsuOnline was applicable, very well accepted, and highly effective for medical education on insulin therapy. In view of its flexibility and easy dissemination, it is a valid option for large-scale CME, potentially helping to reduce clinical inertia and to improve quality of care for DM patients. Trial Registration Clinicaltrials.gov NCT01759953; https://clinicaltrials.gov/ct2/show/NCT01759953 (Archived by WebCite at http://www.webcitation.org/6oeHoTrBf). Background Video-games have become an integral part of the new multimedia culture. Several studies assessed video-gaming enhancement of spatial attention and eye-hand coordination. Considering the technical difficulty of laparoscopic procedures, legal issues and time limitations, the validation of appropriate training even outside of the operating rooms is ongoing. We investigated the influence of a four-week structured Nintendo® Wii™ training on laparoscopic skills by analyzing performance metrics with a validated simulator (Lap Mentor™, Simbionix™). Methodology/Principal Findings We performed a prospective randomized study on 42 post-graduate I–II year residents in General, Vascular and Endoscopic Surgery. All participants were tested on a validated laparoscopic simulator and then randomized to group 1 (controls, no training with the Nintendo® Wii™) and group 2 (training with the Nintendo® Wii™), with 21 subjects in each group, according to a computer-generated list. 
After four weeks, all residents underwent a testing session on the laparoscopic simulator with the same tasks as in the first session. All 42 subjects in both groups improved significantly from session 1 to session 2. Compared to controls, the Wii group showed a significant improvement in performance (p<0.05) for 13 of the 16 considered performance metrics. Conclusions/Significance The Nintendo® Wii™ might be a helpful, inexpensive and entertaining part of the training of young laparoscopists, in addition to a standard surgical education based on simulators and the operating room. Written and clinical tests compared the change in clinical knowledge and practical clinical skill of first-year dental students watching a clinical video recording of the three-step etch-and-rinse resin bonding system to those using an interactive dental video game teaching the same procedure. The research design was a randomized controlled trial with eighty first-year dental students enrolled in the preclinical operative dentistry course. Students' change in knowledge was measured through written examination using a pre-test and a post-test, as well as clinical tests in the form of a benchtop shear bond strength test. There was no statistically significant difference between teaching methods in regards to change in either knowledge or clinical skills, with one minor exception relating to the wetness of dentin following etching. Students expressed their preference for an interactive self-paced method of teaching. We investigated whether engagement modes and perceived self-efficacy differed in surgical novices before and after randomized training in two different video games during five weeks, and a control group with no training. The control group expressed to a higher extent negative engagement modes during training in MIST-VR and GI Mentor II than the experimental groups. No statistically significant differences in self-efficacy were identified between groups. 
Both engagement modes and self-efficacy showed a positive correlation with previous and present video game experience. It is suggested that video game training could have a framing effect on surgical simulator performance. EM and SE might be important intermediate variables in the relationship between current video game experience and simulator performance. Cardiopulmonary resuscitation (CPR) is a key first aid survival technique used to stimulate breathing and keep blood flowing to the heart. Its effective administration can significantly increase the chances of survival for victims of cardiac arrest. LISSA is a serious game designed to complement CPR teaching and also to refresh CPR skills in an enjoyable way. The game presents an emergency situation in a 3D virtual environment and the player has to save the victim by applying the CPR actions. In this paper, we describe LISSA and its evaluation in a population composed of 109 nursing undergraduate students enrolled in the Nursing degree of our university. To evaluate LISSA we performed a randomized controlled trial that compares the classical teaching methodology, composed of self-directed learning for theory plus laboratory sessions with a mannequin for practice, with the one that uses LISSA after self-directed learning for theory and before laboratory sessions with a mannequin. From our evaluation we observed that students using LISSA (Groups 2 and 3) gave significantly better learning acquisition scores than those following traditional classes (Group 1). To evaluate the differences between students of these groups we performed a paired samples t-test between Groups 1 and 2 (μ1 = 35.67, μ2 = 47.50, p<0.05) and between students of Groups 1 and 3 (μ1 = 35.67, μ3 = 50.58, p<0.05). From these tests we observed that there are significant differences in both cases. We also evaluated student performance on the main steps of the CPR protocol. 
Students that used LISSA performed better than the ones that did not use it. Background Many patients with high blood pressure (BP) do not have antihypertensive medications appropriately intensified at clinician visits. We investigated whether an online spaced-education (SE) game among primary care clinicians can decrease time to BP target among their hypertensive patients. Methods and Results A 2-arm randomized trial was conducted over 52 weeks among primary care clinicians at 8 hospitals. Educational content consisted of 32 validated multiple-choice questions with explanations on hypertension management. Providers were randomized into 2 groups: SE clinicians were enrolled in the game, whereas control clinicians received identical educational content in an online posting. SE game clinicians were e-mailed 1 question every 3 days. Adaptive game mechanics resent questions in 12 or 24 days if answered incorrectly or correctly, respectively. Clinicians retired questions by answering each correctly twice consecutively. Posting of relative performance among peers fostered competition. The primary outcome measure was time to BP target (<140/90 mm Hg). One hundred eleven clinicians enrolled. The SE game was completed by 87% of clinicians (48/55), whereas 84% of control clinicians (47/56) read the online posting. In multivariable analysis of 17,866 hypertensive periods among 14,336 patients, the hazard ratio for time to BP target in the SE game cohort was 1.043 (95% confidence interval, 1.007-1.081; P=0.018). The number of hypertensive episodes needed to treat to normalize one additional patient's BP was 67.8. The number of clinicians needed to teach to achieve this was 0.43. Conclusions An online SE game among clinicians generated a modest but significant reduction in the time to BP target among their hypertensive patients. Clinical Trial Registration URL: http://www.clinicaltrials.gov. 
Unique identifier: NCT00904007. We report on a pilot study that investigates the transfer effect of systematic computer game training on performance in image-guided surgery. In a group of 22 surgical novices, subjects were matched and randomized into one group training with a 3-D first person shooter (FPS) game and one group training with a 2-D non-FPS game. We also included a control group. Subjects were tested pre- and post-training in the MIST-VR and GI-Mentor surgical simulators. We found that subjects with past experience specific to FPS games were significantly better at performing the simulated endoscopy task, both regarding time and efficiency of screening, compared to subjects lacking FPS game experience. Furthermore, subjects who underwent systematic FPS game training performed better in the MIST-VR than those training with a 2-D game. Our findings indicate a transfer effect and that experience of video games is important for training outcome in simulated surgical procedures. Video game training can become useful when designing future skills training curricula for surgeons. Background Educational computer games are examples of computer-assisted learning objects, representing an educational strategy of growing interest. Given the changes in the digital world over the last decades, students of the current generation expect technology to be used in advancing their learning, requiring a need to change traditional passive learning methodologies to an active multisensory experimental learning methodology. The objective of this study was to compare a computer game-based learning method with a traditional learning method, regarding learning gains and knowledge retention, as means of teaching head and neck Anatomy and Physiology to Speech-Language and Hearing pathology undergraduate students. 
Methods Students were randomized to participate in one of the learning methods and the data analyst was blinded to which method of learning the students had received. Students' prior knowledge (i.e. before undergoing the learning method), short-term knowledge retention and long-term knowledge retention (i.e. six months after undergoing the learning method) were assessed with a multiple choice questionnaire. Students' performance was compared considering the three moments of assessment, both for the mean total score and for separate mean scores for Anatomy questions and for Physiology questions. Results Students that received the game-based method performed better in the post-test assessment only when considering the Anatomy questions section. Students that received the traditional lecture performed better in both post-test and long-term post-test when considering the Anatomy and Physiology questions. Conclusions The game-based learning method is comparable to the traditional learning method in general and in short-term gains, while the traditional lecture still seems to be more effective to improve students' short- and long-term knowledge retention. Anesthetic management of orthotopic liver transplantation (OLT) is complex. Given the unequal distributions of liver transplant surgeries performed at different centers, anesthesiology providers receive relatively uneven OLT training and exposure. One well-suited modality for OLT training is the "serious game," an interactive application created for the purpose of imparting knowledge or skills, while leveraging the self-motivating elements of video games. We therefore developed a serious game designed to teach best practices for the anesthetic management of a standard OLT and determined if the game would improve resident performance in a simulated OLT. 
Forty-four residents on the liver transplant rotation were randomized to either the gaming group (GG) or the control group (CG) prior to their introductory simulation. Both groups were given access to the same educational materials and literature during their rotation, but the GG also had access to the OLT Trainer. Performance on the simulations was recorded on a standardized grading rubric. Both groups experienced an increase in score relative to baseline that was statistically significant at every stage. The improvements in scores were greater for the GG participants than the CG participants. Overall score improvement between the GG and CG (mean [standard deviation]) was statistically significant (GG, 7.95 [3.65]; CG, 4.8 [4.48]; P = 0.02), as were scores for preoperative assessment (GG, 2.67 [2.09]; CG, 1.17 [1.43]; P = 0.01) and the anhepatic phase (GG, 1.62 [1.01]; CG, 0.75 [1.28]; P = 0.02). Of the residents with game access, 81% were "very satisfied" or "satisfied" with the game overall. In conclusion, adding a serious game to an existing educational curriculum for liver transplant anesthesia resulted in significant learning gains for rotating anesthesia residents. The intervention was straightforward to implement and cost-effective. Liver Transplantation 23:430-439, 2017 AASLD. Background When compared with more traditional instructional methods, game-based e-learning (GbEl) promises higher motivation of learners by presenting contents in an interactive, rule-based and competitive way. Most recent systematic reviews and meta-analyses of studies on game-based learning and GbEl in the medical professions have shown limited effects of these instructional methods. 
Objectives To compare the effectiveness on the learning outcome of a game-based e-learning (GbEl) instruction with a conventional script-based instruction in the teaching of phase contrast microscopy urinalysis under routine training conditions of undergraduate medical students. Methods A randomized controlled trial was conducted with 145 medical students in their third year of training in the Department of Urology at the University Medical Center Freiburg, Germany. 82 subjects were allocated for training with an educational adventure game (GbEl group) and 69 subjects for conventional training with a written script-based approach (script group). Learning outcome was measured with a 34-item single choice test. Students' attitudes were collected by a questionnaire regarding fun with the training, motivation to continue the training and self-assessment of acquired knowledge. Results The students in the GbEl group achieved significantly better results in the cognitive knowledge test than the students in the script group: the mean score was 28.6 for the GbEl group and 26.0 for the script group of a total of 34.0 points, with a Cohen's d effect size of 0.71 (ITT analysis). Attitudes towards the recent learning experience were significantly more positive with GbEl. Students reported having more fun while learning with the game when compared to the script-based approach. Conclusions Game-based e-learning is more effective than a script-based approach for the training of urinalysis in regard to cognitive learning outcome and has a high positive motivational impact on learning. Game-based e-learning can be used as an effective teaching method for self-instruction. Simulation games are becoming increasingly popular in education, but more insight into their critical design features is needed. This study investigated the effects of fidelity of open patient cases in adjunct to an instructional e-module on students' cognitive skills and motivation. 
We set up a three-group randomized post-test-only design: a control group working on an e-module; a cases group, combining the e-module with low-fidelity text-based patient cases; and a game group, combining the e-module with a high-fidelity simulation game with the same cases. Participants completed questionnaires on cognitive load and motivation. After a 4-week study period, blinded assessors rated students' cognitive emergency care skills in two mannequin-based scenarios. In total 61 students participated and were assessed: 16 control group students, 20 cases students and 25 game students. Learning time was 2 h longer for the cases and game groups than for the control group. Acquired cognitive skills did not differ between groups. The game group experienced higher intrinsic and germane cognitive load than the cases group (p = 0.03 and 0.01) and felt more engaged (p < 0.001). Students did not profit from working on open cases (in adjunct to an e-module), which nonetheless challenged them to study longer. The e-module appeared to be very effective, while the high-fidelity game, although engaging, probably distracted students and impeded learning. Medical educators designing motivating and effective skills training for novices should align case complexity and fidelity with students' proficiency level. The relation between case fidelity, motivation and skills development is an important field for further study. Background Serious games have the potential to teach complex cognitive skills in an engaging way, at relatively low cost. Their flexibility in use and scalability makes them an attractive learning tool, but more research is needed on the effectiveness of serious games compared to more traditional formats such as e-modules. 
We investigated whether undergraduate medical students developed better knowledge and awareness and were more motivated after learning about patient safety through a serious game than peers who studied the same topics using an e-module. Methods Fourth-year medical students were randomly assigned to either a serious game that included video-lectures, biofeedback exercises and patient missions (n = 32) or an e-module that included text-based lectures on the same topics (n = 34). A third group acted as a historical control group without extra education (n = 37). After the intervention, which took place during the clinical introduction course, before the start of the first rotation, all students completed a knowledge test, a self-efficacy test and a motivation questionnaire. During the following 10-week clinical rotation they filled out weekly questionnaires on patient-safety awareness and stress. Results The results showed patient safety knowledge had equally improved in the game group and e-module group compared to controls, who received no extra education. Average learning time was 3 h for the game group and 1 h for the e-module group. The serious game was evaluated as more engaging; the e-module as easier to use. During rotations, students in the three groups reported low and similar levels of patient-safety awareness and stress. Students who had treated patients successfully during game missions experienced higher self-efficacy and less stress during their rotation than students who treated patients unsuccessfully. Conclusions Video-lectures (in a game) and text-based lectures (in an e-module) can be equally effective in developing knowledge on specific topics. 
Although serious games are strongly engaging for students and stimulate them to study longer, they do not necessarily result in better performance on patient safety issues. In this meta-analysis, we systematically reviewed research on digital games and learning for K–16 students. We synthesized comparisons of game versus nongame conditions (i.e., media comparisons) and comparisons of augmented games versus standard game designs (i.e., value-added comparisons). We used random-effects meta-regression models with robust variance estimates to summarize overall effects and explore potential moderator effects. Results from media comparisons indicated that digital games significantly enhanced student learning relative to nongame conditions (ḡ = 0.33, 95% confidence interval [0.19, 0.48], k = 57, n = 209). Results from value-added comparisons indicated significant learning benefits associated with augmented game designs (ḡ = 0.34, 95% confidence interval [0.17, 0.51], k = 20, n = 40). Moderator analyses demonstrated that effects varied across game mechanics characteristics, visual and narrative characteristics, and research quality characteristics. Taken together, the results highlight the affordances of games for learning as well as the key role of design beyond medium. Background Equipment-related malfunctions are directly related to one-fourth of the adverse events in the surgical theater. A serious game trains residents to recognize and respond to equipment problems in minimally invasive surgery (MIS). These include disturbed vision, gas transport, electrocautery, and pathophysiological disturbances. This randomized controlled trial explores whether game-based training improves surgical residents' response to equipment-related problems during surgery. Methods Thirty-one surgical residents with no previous experience in MIS took part in a standardized basic laparoscopy training course.
Fifteen residents were randomly assigned to the game-enhanced curriculum (intervention) and sixteen were assigned to the regular curriculum (control). Participants performed an MIS task in a live anesthetized pig model, during which three standardized equipment malfunction scenarios occurred. Observers recorded the problems recognized and solved, time, and participants' technical performance. Results Twenty-four participants completed the post-test (n = 12 per group). The intervention group solved more problems than the control group (59 vs. 33%, p = 0.029). The intervention group also recognized a larger proportion of problems, although this difference was non-significant (67 vs. 42%, p = 0.14). Random-effects modeling showed significantly improved game performance per participant over time. Conclusions Surgical residents who play for only 1 h on a custom-made serious game respond significantly better to equipment-related problems during surgery than residents trained by a standard training curriculum. These results imply that entertaining serious games can indeed be considered for use in official training for surgeons and other medical specialists. Objective: To assess the efficacy of a "spaced-education" game as a method of continuing medical education (CME) among physicians across the globe. Background: The efficacy of educational games for CME has yet to be established. We created a novel online educational game by incorporating game mechanics into "spaced education" (SE), an evidence-based method of online CME. Methods: This 34-week randomized trial enrolled practicing urologists across the globe. The SE game consisted of 40 validated multiple-choice questions and explanations on urology clinical guidelines. Enrollees were randomized to 2 cohorts: cohort A physicians were sent 2 questions via an automated e-mail system every 2 days, and cohort B physicians were sent 4 questions every 4 days.
Adaptive game mechanics re-sent questions after 12 or 24 days if answered incorrectly or correctly, respectively. Questions expired if not answered on time (appointment dynamic). Physicians retired questions by answering each correctly twice in a row (progression dynamic). Competition was fostered by posting relative performance among physicians. Main outcome measures were baseline scores (percentage of questions answered correctly upon initial presentation) and completion scores (percentage of questions retired). Results: A total of 1470 physicians from 63 countries enrolled. Median baseline score was 48% (interquartile range [IQR] 17) and, in multivariate analyses, was found to vary significantly by region (Cohen dmax = 0.31, P = 0.001) and age (dmax = 0.41, P < 0.001). Median completion score was 98% (IQR 25) and varied significantly by age (dmax = 0.21, P < 0.001) and American Board of Urology certification (d = 0.10, P = 0.033) but not by region (multivariate analyses). Question clustering reduced physicians' performance (d = 0.43, P < 0.001). Seventy-six percent of enrollees (1111/1470) requested to participate in future SE games. Conclusions: An online SE game can substantially improve guidelines knowledge and is a well-accepted method of global CME delivery. INTRODUCTION The video game industry has become increasingly popular over recent years, offering photorealistic simulations of various scenarios while requiring motor, visual, and cognitive coordination. Video game players outperform nonplayers on different visual tasks and are faster and more accurate on laparoscopic simulators. The same qualities found in video game players are highly desired in surgeons. Our investigation aims to evaluate the effect of video game play on the development of fine motor and visual skills.
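The appointment and progression dynamics described for the SE game above amount to a simple scheduling rule. The sketch below is a minimal, hypothetical Python model of those mechanics, not the trial's actual software: a wrong answer re-queues a question after 12 days, a right answer after 24 days, and two consecutive correct answers retire it.

```python
# Hypothetical sketch of the spaced-education mechanics described above:
# re-send after 12 days if wrong, 24 days if right, retire after two
# consecutive correct answers. Not the trial's actual implementation.

RESEND_WRONG_DAYS = 12
RESEND_RIGHT_DAYS = 24
STREAK_TO_RETIRE = 2

class SpacedQuestion:
    def __init__(self, qid):
        self.qid = qid
        self.streak = 0          # consecutive correct answers so far
        self.next_due_day = 0    # day the question is next presented
        self.retired = False

    def answer(self, day, correct):
        """Record an answer given on `day` and schedule the next presentation."""
        if self.retired:
            return
        if correct:
            self.streak += 1
            if self.streak >= STREAK_TO_RETIRE:
                self.retired = True                      # progression dynamic
                return
            self.next_due_day = day + RESEND_RIGHT_DAYS
        else:
            self.streak = 0
            self.next_due_day = day + RESEND_WRONG_DAYS

q = SpacedQuestion("urology-guideline-07")
q.answer(0, correct=False)    # wrong -> due again at day 12
q.answer(12, correct=True)    # right once -> due again at day 36
q.answer(36, correct=True)    # right twice in a row -> retired
```

Under these rules a question that is never missed is retired after two presentations 24 days apart, which matches the "twice-in-a-row" progression dynamic reported in the abstract.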
Specifically, we plan to examine whether handheld video devices offer the same improvement in laparoscopic skill as traditional simulators, with less cost and more accessibility. METHODS We performed an Institutional Review Board-approved study including categorical surgical residents and preliminary interns at our institution. The residents were randomly assigned to 1 of 3 study arms: a traditional laparoscopic simulator, XBOX 360 gaming console, or Nintendo DS handheld gaming system. After an introduction survey and a baseline timed test using a laparoscopic surgery box trainer, residents were given 6 weeks to practice on their respective consoles. At the conclusion of the study, the residents were tested again on the simulator and completed a final survey. RESULTS A total of 31 residents were included in the study, representing equal distribution of each class level. The XBOX 360 group spent more time on their console weekly (6 hours per week) compared with the simulator (2 hours per week) and Nintendo (3 hours per week) groups. There was a significant difference in the improvement of the tested time among the 3 groups, with the XBOX 360 group showing the greatest improvement (p = 0.052). The residents in the laparoscopic simulator arm (n = 11) improved 4.6 seconds, the XBOX group (n = 10) improved 17.7 seconds, and the Nintendo DS group (n = 10) improved 11.8 seconds. Residents who played more than 10 hours of video games weekly had the fastest times on the simulator both before and after testing (p = 0.05). Most residents stated that playing the video games helped to ease stress over the 6 weeks and that cooperative play promoted better relationships among colleagues. CONCLUSIONS Studies have shown that residents who engage in video games have better visual, spatial, and motor coordination.
We showed that over 6 weeks, residents who played video games improved their laparoscopic skills more than those who practiced on laparoscopic simulators. The accessibility of gaming systems is one of the most essential factors making these tools a good resource for residents. Handheld games are especially easy to use and offer a readily available means to improve visuospatial and motor abilities. BACKGROUND First experiences in the operating theatre with real patients are always stressful and intimidating for students. We hypothesized that a game-like simulation could improve perceptions and performance of novices. METHODS A videogame was developed, combining pictures and short videos, by which students are interactively instructed on how to act in the surgical block. Moreover, the game includes detailed descriptive information. After playing, students are given feedback on their performance. A randomized controlled trial was conducted with 132 nursing and medical students with no previous experience in surgery. Sixty-two (47.0%) were allocated to a control group (CG) and 70 (53.0%) to an experimental group (EG). Subjects in the EG played the game the day prior to their first experience in the theatre; the CG had no access to the application. On the day after their experience in surgery, all students filled in a questionnaire in a 7-point Likert format collecting subjective data about their experience in the surgical block. Four constructs related to students' feelings, emotions and attitudes were measured through self-reported subjective scales, i.e. C1: fear of making mistakes, C2: perceived knowledge of how to behave, C3: perceived errors committed, and C4: attitude/behaviour towards patients and staff. The main research question was formulated as follows: do students show differences in constructs C1–C4 by exposure to the game?
RESULTS The EG reported statistically significantly higher scores on the four aspects measured than the CG (p < 0.05; Mann-Whitney U tests; Cohen's d standardized effect sizes d1 = 0.30; d2 = 1.05; d3 = 0.39; d4 = 0.49). CONCLUSIONS The results show clear evidence that exposure to the game-like simulation had a significant positive effect on all the constructs. After their first visit to the theatre, students in the EG showed less fear (C1) and also perceived that they had committed fewer errors (C3), while they showed higher perceived knowledge (C2) and a more collaborative attitude (C4). OBJECTIVE By exploiting video games technology, serious games strive to deliver affordable, accessible and usable interactive virtual worlds, supporting applications in training, education, marketing and design. The aim of the present study was to evaluate the effectiveness of such a serious game in the teaching of major incident triage by comparing it with traditional training methods. DESIGN Pragmatic controlled trial. METHOD During Major Incident Medical Management and Support courses, 91 learners were randomly distributed into one of two training groups: 44 participants practised the triage sieve protocol using a card-sort exercise, whilst the remaining 47 participants used a serious game. Following the training sessions, each participant undertook an evaluation exercise whereby they were required to triage eight casualties in a simulated live exercise. Performance was assessed in terms of tagging accuracy (assigning the correct triage tag to the casualty), step accuracy (following correct procedure) and time taken to triage all casualties. Additionally, the usability of both the card-sort exercise and the video game was measured using a questionnaire. RESULTS Tagging accuracy by participants who underwent the serious game training was significantly higher than by those who undertook the card-sort exercise [Chi² = 13.126, p = 0.02].
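The chi-square comparisons reported in the triage study above can be illustrated with the standard uncorrected chi-square statistic for a 2×2 table. The tag counts in this sketch are hypothetical (the abstract reports only the test statistics, not the underlying counts); only the form of the analysis comes from the source.

```python
# Uncorrected chi-square for a 2x2 table:
#   chi2 = n * (a*d - b*c)^2 / ((a+b)(c+d)(a+c)(b+d))
# The tag counts below are hypothetical; the abstract reports only Chi2 and p.

def chi_square_2x2(a, b, c, d):
    """a, b = group 1 correct/incorrect tags; c, d = group 2 correct/incorrect tags."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical: 47 game-trained learners tag 300/376 casualties correctly,
# 44 card-sort learners tag 240/352 correctly (8 casualties each).
chi2 = chi_square_2x2(300, 76, 240, 112)
print(round(chi2, 2))  # prints 12.78, above the 3.84 critical value (df = 1, alpha = 0.05)
```

With one degree of freedom, any statistic above 3.84 is significant at the 5% level, which is the comparison implicit in the bracketed results in the abstract.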
Step accuracy was also higher in the serious game group, but only in terms of the number of participants who followed correct procedure when triaging all eight casualties [Chi² = 5.45, p = 0.0196]. There was no significant difference in time to triage all casualties (card-sort = 435 ± 74 s vs video game = 456 ± 62 s, p = 0.155). CONCLUSION Serious game technologies offer the potential to enhance learning and improve subsequent performance when compared to traditional educational methods. Adequate cardiopulmonary resuscitation (CPR) skill is essential in improving the survival rate of sudden cardiac arrest (SCA). However, the skill deteriorates rapidly following CPR training. We developed a computer game using 3-dimensional virtual technology (3-D CPR game) for laypersons with the aim of improving skill retention. In the testing phase, a randomized controlled trial, in which we recruited 97 freshman medical students who had no prior CPR training experience, was used to test its effect on 3-month CPR skill retention. The usability of the game was also tested using a 33-item questionnaire rated on a 5-point Likert scale. Three months after the initial CPR training, the retention rate of CPR skill in the game group was significantly higher compared with the control (p < 0.05), and the average scores on the 4 dimensions of usability were 3.99-4.05. Overall, using the 3-D CPR game to improve CPR skill retention is feasible and effective. OBJECTIVE Medical students often lack training in complex geriatric medical decision making. We therefore developed the serious game GeriatriX for training medical decision making with weighing of patient preferences, and the appropriateness and costs of medical care. We hypothesized that education with GeriatriX would improve the ability to deal with geriatric decision making and also increase cost consciousness. DESIGN A randomized, controlled pre-post measurement design. PARTICIPANTS Fifth-year medical students.
INTERVENTION Playing the serious game GeriatriX as an additive to usual geriatric education. MEASUREMENTS We evaluated the effects of playing GeriatriX on self-perceived knowledge of geriatric themes and the self-perceived competence of weighing patient preferences, appropriateness, and costs of medical care in geriatric decision making. Cost consciousness was evaluated with a postmeasurement estimating the costs of different diagnostic tests. RESULTS There was a large positive increase in the self-perceived competence of weighing patient preferences, appropriateness, and costs of medical care in the intervention group (n = 71) (effect sizes of 0.7, 1.0, and 1.2, respectively), which was significantly better for the last 2 aspects than in the control group (n = 63). The intervention group performed better on cost consciousness. Although self-perceived knowledge increased substantially on some geriatric topics, this improvement did not differ between the intervention and control groups. CONCLUSIONS After playing the serious game GeriatriX, medical students have a higher self-perceived competence in weighing patient preferences, appropriateness, and costs of medical care in complex geriatric medical decision making. Playing GeriatriX also resulted in better cost consciousness. We therefore encourage wider use of GeriatriX to teach geriatrics in medical curricula, and further research on its educational and health care outcomes. BACKGROUND The playing of video games (VGs) was previously shown to improve surgical skills. This is the first randomized, controlled study to assess the impact of VG genre on the development of basic surgical skills. MATERIALS AND METHODS Twenty first-year, surgically inexperienced medical students attended a practical course on surgical knots, suturing, and skin-flap technique.
Later, they were randomized into four groups: control/nongaming (ContG), first-person shooter game (ShotG), racing game (RaceG), and surgery game (SurgG). All participants had 3 wk of Nintendo Wii training. Surgical and VG performances were assessed by two independent, blinded surgeons who evaluated basal performance (time 0) and performance after 1 wk (time 1) and 3 wk (time 2) of training. RESULTS The training time of RaceG was longer than that of ShotG and SurgG (P = 0.045). Compared to SurgG and RaceG, VG scores for ShotG improved less between times 0 and 1 (P = 0.010) but more between times 1 and 2 (P = 0.004). Improvement in mean surgical performance scores versus time differed in each VG group (P = 0.011). At time 2, surgical performance scores were significantly higher in ShotG (P = 0.002) and SurgG (P = 0.022) than in ContG. The surgical performance scores of RaceG were not significantly different from the score achieved by ContG (P = 0.279). CONCLUSIONS Different VG genres may differentially impact the development of surgical skills by medical students. More complex games seem to improve performance even if played less. Although further studies are needed, surgery-related VGs with sufficient complexity and playability could be a feasible adjuvant to improving surgical skills. Background Previous studies have shown a correlation between previous video game experience and performance in minimally invasive surgical simulators. The hypothesis is that systematic video game training with high visual-spatial demands and visual similarity to endoscopy would show a transfer effect on performance in virtual reality endoscopic surgical simulation. Methods A prospective randomized study was performed.
Thirty surgical novices were matched and randomized to five weeks of systematic video game training in either a first-person shooter game (Half-Life), with high visual-spatial demands and visual similarities to endoscopy, or a video game with mainly cognitive demands (Chessmaster). A matched control group (n = 10) performed no video game training during the five weeks. Performance in two virtual reality endoscopic surgical simulators (MIST-VR and GI Mentor II) was measured pre- and post-training. Before simulator training we also controlled for students' visual-spatial ability, visual working memory, age, and previous video game experience. Results The group training with Half-Life showed significant improvement in two GI Mentor II variables and the MIST-VR task MD level medium. The group training with Chessmaster showed an improvement only in the MIST-VR task. No effect was observed in the control group. As recently shown in other studies, current and previous video game experience was important for simulator performance. Conclusions Systematic video game training improved surgical performance in advanced virtual reality endoscopic simulators. The transfer effect increased with increasing visual similarity. Performance in intense, visual-spatially challenging video games might be a predictive factor for the outcome in surgical simulation. BACKGROUND Preparing nursing students with the knowledge and skills required for the administration and monitoring of blood components is crucial for entry into clinical practice. Serious games create opportunities to develop this competency and can be used as a self-directed learning strategy to complement existing didactic learning and simulation-based strategies. AIM To describe the development and evaluation of a serious game to improve nursing students' knowledge, confidence, and performance in blood transfusion.
METHOD An experiential gaming model was applied to guide the design of the serious game environment. A clustered, randomized controlled trial was conducted with 103 second-year undergraduate nursing students who were randomized into control or experimental groups. After a baseline evaluation of the participants' knowledge of and confidence in the blood transfusion procedure, the experimental group undertook a blood transfusion serious game and completed a questionnaire to evaluate their learning experience. All participants' clinical performances were evaluated in a simulated environment. RESULTS The post-test knowledge and confidence mean scores of the experimental group improved significantly (p < 0.001) after the serious game intervention compared to the pre-test mean scores and to the post-test mean scores of the control group (p < 0.001). However, no significant difference (p = 0.11) was found between the experimental and control groups on the post-test performance mean scores. The participants evaluated the serious game positively. CONCLUSION The study provided evidence of the effectiveness of a serious game in improving the knowledge and confidence of nursing students in blood transfusion practice. The features of this serious game could be further developed to incorporate additional scenarios with repetitive exercises and feedback to enhance the impact on clinical performance. Given the flexibility, practicality, and scalability of such a game, it can serve as a promising approach to optimize learning when blended with high-fidelity simulation. Synthesizing evidence from randomized controlled trials of digital health education poses some challenges. These include a lack of clear categorization of digital health education in the literature; constantly evolving concepts, pedagogies, or theories; and a multitude of methods, features, technologies, or delivery settings.
The Digital Health Education Collaboration was established to evaluate the evidence on digital education in the health professions; inform policymakers, educators, and students; and ultimately, change the way in which these professionals learn and are taught. The aim of this paper is to present the overarching methodology that we use to synthesize evidence across our digital health education reviews and to discuss challenges related to the process. For our research, we followed Cochrane recommendations for the conduct of systematic reviews; all reviews are reported according to the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidance. This included assembling experts in various digital health education fields; identifying gaps in the evidence base; formulating focused research questions, aims, and outcome measures; choosing appropriate search terms and databases; defining inclusion and exclusion criteria; running the searches jointly with librarians and information specialists; managing abstracts; retrieving full-text versions of papers; extracting and storing large data sets; critically appraising the quality of studies; analyzing data; discussing findings; drawing meaningful conclusions; and drafting research papers. The approach used for synthesizing evidence from digital health education trials is commonly regarded as the most rigorous benchmark for conducting systematic reviews. Although we acknowledge the presence of certain biases ingrained in the process, we have clearly highlighted and minimized those biases by strictly adhering to scientific rigor, methodological integrity, and standard operating procedures. This paper will be a valuable asset for researchers and methodologists undertaking systematic reviews in digital health education.
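The random-effects pooling used in the K–16 games meta-analysis above (ḡ = 0.33) can be sketched with the classic DerSimonian–Laird estimator: estimate the between-study variance τ² from Cochran's Q, then pool with inverse-variance weights that include τ². The three study effects and standard errors below are invented for illustration; only the method, not the data, comes from the source.

```python
# DerSimonian-Laird random-effects pooling sketch: estimate between-study
# variance tau^2 from Cochran's Q, then pool with weights 1/(se^2 + tau^2).
# Effect sizes and standard errors below are invented for illustration.

def dersimonian_laird(effects, ses):
    w = [1 / se**2 for se in ses]                                 # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, effects)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, effects))   # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                                 # truncated at zero
    w_re = [1 / (se**2 + tau2) for se in ses]                     # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_re, effects)) / sum(w_re)
    se_pooled = (1 / sum(w_re)) ** 0.5
    return pooled, tau2, se_pooled

# Three hypothetical game-vs-nongame effect sizes (Hedges' g) with standard errors.
pooled, tau2, se = dersimonian_laird([0.10, 0.35, 0.60], [0.10, 0.12, 0.15])
```

Because the pooled estimate is a positively weighted average, it always lies between the smallest and largest study effects; a 95% confidence interval follows as pooled ± 1.96 × se.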
11,014
27,544,827
Moderate-quality evidence showed that the Seldinger technique has a higher primary implantation success rate compared with the venous cutdown technique. Moderate-quality evidence showed no difference in the overall complication rate between the Seldinger and venous cutdown techniques. However, when the Seldinger technique with subclavian vein access was compared with the venous cutdown group, there was a higher reported incidence of catheter complications. The rates of pneumothorax and infection did not differ between the Seldinger and venous cutdown groups.
BACKGROUND Totally implantable venous access ports (TIVAPs) provide patients with safe and permanent venous access, for instance in the administration of chemotherapy for oncology patients. There are several methods for TIVAP placement, and the optimal evidence-based method is unclear. OBJECTIVES To compare the efficacy and safety of three commonly used techniques for implanting TIVAPs: the venous cutdown technique, the Seldinger technique, and the modified Seldinger technique. This review includes studies that use Doppler or real-time two-dimensional ultrasonography for locating the vein in the Seldinger technique.
STUDY OBJECTIVE To compare percutaneous nonangiographic insertion of a venous access device with a standard surgical cutdown insertion technique. DESIGN Prospective, controlled, randomized study. SETTING Operating room and anesthesia induction room of a university hospital. PATIENTS 100 consecutive oncology patients scheduled for intravenous chemotherapy. INTERVENTIONS Patients were randomized to two groups: (1) the percutaneous group received implantation through the internal jugular vein by experienced anesthesiologists, whereas (2) the surgical group received venous cutdown insertion through the cephalic or subclavian vein by surgeons (n = 50 for each group). MEASUREMENTS Duration of procedure, long-term device function, complications such as hematoma formation, infection, hemothorax, and pneumothorax, and patients' satisfaction with the placement procedure at two months of follow-up were all measured and recorded. MAIN RESULTS The percutaneous technique was found to have several advantages, including reduced time for insertion and greater patient satisfaction with the procedure. The percutaneously implanted devices also had fewer insertion-associated complications. CONCLUSION The simplified, percutaneous, nonangiographic technique is as effective as the traditional venous cutdown technique and can be safely performed by surgeons as well as by experienced physicians who are not surgeons. BACKGROUND Central venous access is extensively used in oncology, though practical information from randomized trials on the most convenient insertion modality and site is unavailable. METHODS Four hundred and three patients eligible to receive i.v. chemotherapy for solid tumors were randomly assigned to implantation of a single type of port (Bard Port, Bard Inc.
, Salt Lake City, UT), through a percutaneous landmark access to the internal jugular vein, an ultrasound (US)-guided access to the subclavian vein, or a surgical cutdown access through the cephalic vein at the deltoid-pectoralis groove. Early and late complications were prospectively recorded until removal of the device, the patient's death, or the end of the study. RESULTS Four hundred and one patients (99.9%) were assessable: 132 with the internal jugular, 136 with the subclavian and 133 with the cephalic vein access. The median follow-up was 356.5 days (range 0-1087). No differences were found in early complication rates among the three groups {internal jugular: 0% [95% confidence interval (CI) 0.0% to 2.7%], subclavian: 0% (95% CI 0.0% to 2.7%), cephalic: 1.5% (95% CI 0.1% to 5.3%)}. The US-guided subclavian insertion site had significantly fewer failures (e.g. failed attempts to place the catheter in agreement with the original arm of randomization, P = 0.001). Infections occurred in one, three and one patients (internal jugular, subclavian and cephalic access, respectively, P = 0.464), whereas venous thrombosis was observed in 15, 8 and 11 patients (P = 0.272). CONCLUSIONS Central venous insertion modality and site had no impact on either early or late complication rates, but US-guided subclavian insertion showed the lowest proportion of failures. Background The insertion of central venous access devices, such as totally implantable venous access ports (TIVAPs), is routine in patients who need safe and permanent venous access. The number of port implantations is increasing due to the development of innovative adjuvant and neo-adjuvant therapies. Currently, two different strategies are routinely used: surgical cutdown of the cephalic vein (vena section) and direct puncture of the subclavian vein.
The aim of this trial is to identify the strategy for the implantation of TIVAPs with the lowest risk of pneumothorax and haemothorax. Methods/Design The PORTAS-3 trial is designed as a multicentre, randomised controlled trial to compare two implantation strategies. A total of 1,154 patients will be randomised after giving written informed consent. Patients must be over 18 years of age and scheduled for primary implantation of a TIVAP on the designated side. The primary endpoint will be the frequency of pneumothorax and haemothorax after insertion of a TIVAP by one of two different strategies. The experimental intervention is as follows: an open strategy, defined as surgical cutdown of the cephalic vein, supported by a rescue technique if necessary, and in the case of failure, direct puncture of the subclavian vein. The control intervention is as follows: direct puncture of the subclavian vein using the Seldinger technique guided by sonography, fluoroscopy or the landmark technique. The trial duration is approximately 36 months, with a recruitment period of 18 months and a follow-up period of 30 days. Discussion The PORTAS-3 trial will compare two different TIVAP implantation strategies with regard to their individual risk of postoperative pneumothorax and haemothorax. Since TIVAP implantation is one of the most common procedures in general surgery, the results will be of interest to a large community of surgeons as well as oncologists and general practitioners. The pragmatic trial design ensures that the results will be generalizable to a wide range of patients. Trial registration The trial protocol was registered on 28 August 2014 with the German Clinical Trials Register (DRKS00004900).
The World Health Organization's Universal Trial Number is U1111-1142-4420. The aim of this randomized controlled study was to compare the primary success rate between venous cutdown and the Seldinger technique for placement of the totally implantable venous access port (TIVAP). Permanent central venous access devices (PCVAD) are widely used in the management of chronically ill patients, particularly in neoplastic diseases. The standard approach consists of positioning the catheter in the superior vena cava (SVC) either by subclavian or internal jugular vein puncture, or by cephalic or external jugular vein cutdown, with the port implanted in a subcutaneous pouch of the thoracic region. Alternative insertion sites can be used in selected cases. In our experience, consisting of 158 PCVADs, 12 cases required a different insertion site: six cases of an SVC catheter and port on the forearm using a basilic vein cutdown, and six cases of an inferior vena cava (IVC) catheter and port in the abdominal region using a great saphenous vein cutdown. Comparing standard to alternative approaches, we observed a total morbidity rate of 8.9% and 8.3%, respectively (P = NS), while the explant rate was 5.4% vs 8.3% (P = 0.1). Our data show non-significant differences in morbidity and explant rates between the two groups of patients. Alternative insertion sites for PCVAD implantation seem to be a valid option in the management of chronically ill patients. BACKGROUND This prospective randomized study evaluated complications related to long-term totally implantable catheters in oncologic children and adolescents by comparing venipuncture performed either in the jugular or the subclavian vein. METHODS A total of 83 catheters were implanted from January 2004 to April 2006 and followed up until March 2008. Patients were randomly allocated to the subclavian or jugular vein group.
The endpoint was complications that led to catheter revision or catheter removal. RESULTS Six patients were excluded; 43 had the catheter implanted in the subclavian and 34 in the jugular vein. Subclavian catheters were used for up to 12.6 months, while jugular catheters were kept in place for up to 14.8 months (P = 0.38). No statistical differences were found between the groups concerning age, sex, leukocyte count, platelet count, type of admission (in- or outpatient), or previous chemotherapy regimens. When analyzed individually, long-term complications did not present statistically significant differences either. Infection occurred in 20 and 11% (P = 0.44), while catheter embolism took place in 23 and 8% (P = 0.11) of patients with subclavian and jugular catheters, respectively. A statistical difference was seen in the total number of complications, which occurred in 48 and 23% (P = 0.02) of patients in the subclavian and jugular groups, respectively. CONCLUSIONS Catheters implanted by puncture of the subclavian vein were more prone to late complications than those implanted in the jugular vein. Totally implantable venous access ports are valuable instruments for long-term intravenous treatment of patients with cancer, but implantation and use of these devices may be associated with complications. The aim of our study was to compare two implantation techniques in order to establish which one is better for the patient and the surgeon as regards morbidity, surgical time, tolerability, and costs. A prospective study was conducted on a series of 99 patients undergoing implantation of totally implantable venous access ports with surgical cutdown or percutaneous access from January 2000 to June 2004 at the Department of Surgical Sciences, Organ Transplantation and Advanced Technologies.
Our experience shows that there are no statistically significant differences between these two techniques in terms of associated morbidity, technical failure, operative time and patient acceptance PURPOSE To examine the safety, efficacy, costs, and impact on quality of life of venous access ports implanted at the outset of a course of intravenous cancer chemotherapy. PATIENTS AND METHODS Adults beginning a course of intravenous chemotherapy at two university-affiliated hospitals were randomly allocated to have venous access using a surgically implanted venous access port (Port-a-Cath; Pharmacia, Canada Inc, Montreal, Québec, Canada) or using standard peripheral venous access. All accesses were documented by number, route, purpose, and procedure duration. Outcome measurements included port complications, access strategy failure, access-related anxiety and pain, quality of life (Functional Living Index-Cancer [FLI-C]), and costs. RESULTS Port complication rates were low (0.23/1,000 days). Failure occurred in two (3.4%) of 59 port subjects and 16 (26.7%) of 60 controls (P = .0004) at a median period of 26 days after randomization (95% confidence interval, 8 to 92). Peripheral accesses in port subjects took less time, had less access-related anxiety and pain, and were less costly to perform than in controls. Allocation had no effect on FLI-C scores. Peripheral access failure correlated with allocation to the control group (P = .007), higher pain scores with intravenous (IV) starts (P = .003), and anxiety with IV starts (P = .01). Venous accessing overall in port patients was four times more costly than that in controls ($2,178/patient v $530/patient, respectively). CONCLUSION Ports were safe and effective but had no detectable impact on functional quality of life, despite less access-related anxiety, pain, and discomfort.
Because only approximately one quarter of control patients ultimately required central venous access, economic considerations suggest that port-use policies should be based upon defined criteria of need Objective: Comparison of two different insertion techniques for implantation of totally implantable access ports (TIAP). Background: TIAP are introduced through different open and closed cannulation strategies and by various medical experts. The aim of this expertise-based randomized trial was to compare the venous cutdown approach with puncture of the subclavian vein. Methods: One hundred and ten patients scheduled for primary implantation of a TIAP were randomly assigned to either the open insertion technique performed by surgeons or puncture of the subclavian vein under fluoroscopic guidance by radiologists at an outpatient single university center. The primary endpoint was the primary success rate of the cannulation strategy. A logistic regression model was used for analysis, adjusting for age, Karnofsky index, body mass index, and the surgeons' and radiologists' experience. Results: Percutaneous cannulation was not superior to surgical venous cutdown in the intention-to-treat analysis (odds ratio, 0.37; 95% CI, 0.07; 2.15) or the as-treated analysis (odds ratio, 0.16; 95% CI, 0; 1.28). The procedure was shorter with surgery (median, 21 minutes; 95% CI, 14; 30) than with radiology (median, 45 minutes; 95% CI, 43; 50) (P < 0.001), and the dose of radiation was lower with surgery (median, 37 cGy/cm2; 95% CI, 26; 49) than with radiology (200 cGy/cm2; 95% CI, 200; 300) (P < 0.001). Conclusion: Central venous cannulation for insertion of TIAPs can be performed safely and effectively with both approaches.
The open direct surgical access requires further strategies for successful placement of a TIAP, and the percutaneous Seldinger technique requires more time and a higher dose of radiation and is associated with a risk of pneumothorax
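The pediatric catheter trial summarized above reports total complications in 48% of 43 subclavian patients versus 23% of 34 jugular patients (P = 0.02). As a hedged illustration only, not part of any trial report, a pooled two-proportion z-test reproduces a p-value of that magnitude; the event counts 21 and 8 are assumptions rounded from the reported percentages.

```python
from math import sqrt, erfc

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided pooled two-proportion z-test; returns (z, p)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)              # pooled event proportion
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p = erfc(abs(z) / sqrt(2))                  # two-sided normal tail probability
    return z, p

# Assumed counts: 48% of 43 ~ 21 events, 23% of 34 ~ 8 events
z, p = two_proportion_z(21, 43, 8, 34)
```

The resulting p-value (about 0.02) is consistent with the P = 0.02 quoted in the abstract, though the trial's own analysis may have used a different test.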
11,015
24,575,725
The findings suggest that it is possible to reduce loneliness by using educational interventions focused on social network maintenance and enhancement. Multiple approaches show promise, although flawed design often prevents proper evaluation of efficacy.
Objectives. Loneliness is common among older persons and has been associated with health and mental health risks. This systematic review examines the utility of loneliness interventions among older persons.
AIM This paper is a report of a study to explore the effects of a psychosocial group nursing intervention on older people's feelings of loneliness, social activity and psychological well-being. BACKGROUND Older people's loneliness is associated with low quality of life, impaired health, increased use of health and social services and increased mortality. Previous intervention studies have achieved quite modest results. METHOD A randomized controlled trial was conducted between 2003 and 2006 using a group intervention aimed at empowering older people, and promoting peer support and social integration. A total of 235 people (> 74 years) suffering from loneliness met 12 times with professional leaders in groups. The UCLA Loneliness Scale and Lubben's Social Network Scale were used at entry, after 3 and 6 months. Psychological well-being was charted using a six-dimensional questionnaire at baseline and 12 months later. FINDINGS A statistically significantly larger proportion of intervention group participants had found new friends during the follow-up year (45% vs. 32%, P = 0.048), and 40% of intervention group participants continued their group meetings for 1 year. However, no differences were found in loneliness or social networks between the groups. The psychological well-being score improved statistically significantly in the intervention groups [+0.11, 95% confidence interval (CI): +0.04 to +0.13], compared with the controls (+0.01, 95% CI: -0.05 to +0.07, P = 0.045). Feeling needed was statistically significantly more common in the intervention groups (66%) than in controls (49%, P = 0.019). CONCLUSION New sensitive measurements of loneliness and social isolation are needed to measure fluctuations in feelings of loneliness and in social isolation BACKGROUND Animal-assisted therapy (AAT) is claimed to have a variety of benefits, but almost all published results are anecdotal.
We characterized the resident population in long-term care facilities desiring AAT and determined whether AAT can objectively improve loneliness. METHODS Of 62 residents, 45 met inclusion criteria for the study. These 45 residents were administered the Demographic and Pet History Questionnaire (DPHQ) and Version 3 of the UCLA Loneliness Scale (UCLA-LS). They were then randomized into three groups (no AAT; AAT once/week; AAT three times/week; n = 15/group) and retested with the UCLA-LS near the end of the 6-week study. RESULTS Use of the DPHQ showed residents volunteering for the study had a strong life-history of emotional intimacy with pets and wished that they currently had a pet. AAT was shown by analysis of covariance followed by pairwise comparison to have significantly reduced loneliness scores in comparison with the no AAT group. CONCLUSIONS The desire for AAT strongly correlates with previous pet ownership. AAT reduces loneliness in residents of long-term care facilities Background Social isolation affects a significant proportion of older people and is associated with poor health outcomes. The current evidence base regarding the effectiveness of interventions targeting social isolation is poor, and the potential utility of mentoring for this purpose has not previously been rigorously evaluated. The purpose of this study was to examine the effectiveness of a community-based mentoring service for improving mental health, social engagement and physical health for socially isolated older people. Methods This prospective controlled trial compared a sample of mentoring service clients (intervention group) with a matched control group recruited through general practice. One hundred and ninety-five participants from each group were matched on mental wellbeing and social activity scores. Assessments were conducted at baseline and at six-month follow-up.
The primary outcome was the Short Form Health Survey v2 (SF-12) mental health component score (MCS). Secondary outcomes included the SF-12 physical health component score (PCS), EuroQol EQ-5D, Geriatric Depression Score (GDS-10), social activity, social support and morbidities. Results We found no evidence that mentoring was beneficial across a wide range of participant outcomes measuring health status, social activity and depression. No statistically significant between-group differences were observed at follow-up in the primary outcome (p = 0.48) or in most secondary outcomes. Identifying suitable matched pairs of intervention and control group participants proved challenging. Conclusions The results of this trial provide no substantial evidence supporting the use of community mentoring as an effective means of alleviating social isolation in older people. Further evidence is needed on the effectiveness of community-based interventions targeting social isolation. When using non-randomised designs, there are considerable challenges in the recruitment of suitable matches from a community sample. Trial registration SCIE Research Register for Social Care OBJECTIVE The impact of depression and perceived loneliness in the oldest old is largely unknown. The authors studied the relationship between the presence of depressive symptoms and all-cause mortality in old age, especially the potential distorting effect of perceived loneliness. METHOD Within a prospective population-based study of 85-year-olds, the 15-item Geriatric Depression Scale and the Loneliness Scale were annually applied in all 476 participants with a Mini-Mental State Examination score of 18 points or more. RESULTS Depression was present in 23% and associated with marital state, institutionalization, and perceived loneliness.
When depression and perceived loneliness were assessed during follow-up, neither depression nor perceived loneliness had a significant effect on mortality. However, those who suffered from both depression and feelings of loneliness had a 2.1 times higher mortality risk. CONCLUSIONS The data suggest that the increased mortality risk attributable to depression in the presence of perceived loneliness may result from motivational depletion In the present randomized controlled trial (RCT) it was investigated whether single women, 55 years of age and older, improved with regard to self-management ability, well-being, and social and emotional loneliness after having participated in a newly designed self-management group intervention based on the Self-Management of Well-being (SMW) theory. The expected mediating effect of self-management ability on well-being was not found. Although self-management ability, well-being and loneliness improved significantly in the intervention group immediately after the intervention, and also remained at this improved level after six months, there was also improvement in the control group after six months, rendering the longer-term differences between the groups non-significant. It can, however, be concluded that, although the longer-term effectiveness could not be proven, this SMW theory-based intervention seems to be useful in supporting older women to improve their self-management ability and well-being The aim of the study was to examine the prevalence and self-reported causes of loneliness among the Finnish older population. The data were collected with a postal questionnaire from a random sample of 6,786 elderly people (> or = 75 years of age). The response rate was 71.8% from the community-dwelling sample. Of the respondents, 39% suffered from loneliness, 5% often or always. Loneliness was more common among rural elderly people than those living in cities.
It was associated with advancing age, living alone or in a residential home, widowhood, low level of education and poor income. In addition, poor health status, poor functional status, poor vision and loss of hearing increased the prevalence of loneliness. The most common subjective causes for loneliness were illnesses, death of a spouse and lack of friends. Loneliness seems to derive from societal life changes as well as from natural life events and hardships originating from aging Background: Seniors aged 75 and above have the highest suicide rates of all age groups in most industrialized countries. However, research concerning risk factors for suicide in the old elderly is sparse. Objective: The purpose was to determine predictors for suicide among the old elderly (75+). Data concerning the young elderly (65–74 years) are shown for comparison. Methods: 85 consecutive cases of suicide that occurred in western Sweden and 153 control persons with the same sex, birth year, and zip code as the suicide cases were randomly selected from the tax register. The old elderly group included 38 cases and 71 controls; the young elderly group included 47 cases and 82 controls. Data concerning the suicide cases were collected through interviews with close informants; controls were interviewed in person. The interview included questions on past-year life events and mental and physical health. Medical records were reviewed for cases and controls. The Cumulative Illness Rating Scale – Geriatrics was used to rate illness burden. Results: Family conflict, serious physical illness, loneliness, and both major and minor depressions were associated with suicide in the 75+ group. Economic problems predicted suicide in the younger but not in the older elderly. Old elderly suicide victims with depression (major or minor) were less likely to have received depression treatment than their younger counterparts.
Conclusions: Better recognition and treatment of both major and minor depression should constitute an important target for the prevention of suicide in the old elderly. Intervention studies with large numbers of senior participants are sorely needed Objective: The objective of this trial, the Leiden 85-Plus Occupational Therapy Intervention Study (LOTIS), was to assess whether unsolicited occupational therapy, as compared to no therapy, can decelerate the increase in disability in high-risk elderly people. Design: This was a randomised controlled trial with 2-y follow-up. Setting: The study took place in the municipality of Leiden in the Netherlands. Participants: The participants were 402 community-dwelling 85-y-old people, with a Mini-Mental State Examination score of > 18 points at baseline. Interventions: Participants in the intervention group were visited by an occupational therapist who provided training and education about assistive devices that were already present and who gave recommendations and information about procedures, possibilities, and costs of assistive devices and community-based services. Control participants were not visited by an occupational therapist. Outcome Measures: The primary outcome measure was the score achieved on the Groningen Activity Restriction Scale. Secondary outcome measures included self-evaluations of well-being and feelings of loneliness. Results: The participants were evenly divided between the two groups: 202 participants were allocated to the intervention group and 200 participants to the control group. Of the 202 participants randomised to occupational therapy, 55 participants declined the proposed intervention. An occupational therapist indicated that of the remaining 147 participants, 66 (45%) needed an occupational therapy intervention. A total of 44 new assistive devices and five community-based services were implemented.
During follow-up there was a progressive increase in disability in the intervention group (mean annual increase, 2.0 points; SE 0.2; p < 0.001) and control group (mean annual increase, 2.1 points; SE 0.2; p < 0.001). The increase in disability was not significantly different between study groups (0.08 points; 95% CI, −1.1–1.2; p = 0.75). There was also no difference between study groups for any of the secondary outcome measures. Conclusion: Unsolicited occupational therapy in high-risk elderly participants does not decelerate the increase in disability over time Objectives: This study examined relations between social isolation, loneliness, and social support to health outcomes in a sample of New Mexico seniors. Method: We used random-digit dialing to obtain a random sample of 755 southern New Mexico seniors. Participants answered questions pertaining to demographics, social isolation and loneliness, social support, and disease diagnosis including diabetes, hypertension, heart disease, liver disease, arthritis, emphysema, tuberculosis, kidney disease, cancer, asthma, and stroke. The sample allowed for comparison of Caucasian and Hispanic participants. Results: Correlational and logistic analyses indicated that belongingness support related most consistently to health outcomes. Ethnic subgroup analysis revealed similarities and differences in the pattern of associations among the predictor and outcome variables. Discussion: The results demonstrate the importance of social variables for predicting disease outcomes in the elderly and across ethnic groups Seniors are most vulnerable to conjugal bereavement. Although social support buffers the effects of bereavement, widows and widowers have lower levels of social support than married individuals. Self-help/support groups can supplement support from their depleted natural networks.
Accordingly, the aim of this demonstration project was to examine the impact of support groups on widowed seniors' loneliness, affect, and perceived support. Four face-to-face support groups for widowed seniors were conducted weekly for a maximum of 20 weeks. Participants completed pretest, posttest, and delayed posttest measures of support need and support satisfaction, positive and negative affect, and loneliness/isolation. The statistically significant impacts of the intervention were enhanced support satisfaction, diminished support needs, and increased positive affect. There was a trend toward decreased social isolation and emotional loneliness. In postintervention semistructured interviews, bereaved seniors reported increased hope, improved skills in developing social relationships, enhanced coping, new role identities, and less loneliness. Community health nurse researchers could conduct randomized controlled trials of face-to-face and telephone support groups for bereaved people of all ages. Community health nurse practitioners could benefit from lessons learned about timing, duration, and selection of sensitive outcomes Background: Emotional loneliness and social isolation are major problems in old age. These concepts are interrelated and often used interchangeably, but few studies have investigated them simultaneously, thus trying to clarify their relationship. Objectives: To describe the prevalence of loneliness among aged Finns and to study the relationship of loneliness with the frequency of social contacts, and with older people's expectations and satisfaction with their human relationships. Especially, we wanted to clarify whether emotional loneliness is a separate concept from social isolation. Methods: The data were collected with a postal questionnaire.
Background information, feelings of loneliness, number of friends, frequency of contacts with children, grandchildren and friends, the expectations of frequency of contacts as well as satisfaction with the contacts were inquired. The questionnaire was sent to a random sample of 6,786 aged people (> 74 years) in various urban and rural areas in Finland. We report here the results of community-dwelling respondents (n = 4,113). Main Results: More than one third of the respondents (39.4%) suffered from loneliness. The feeling of loneliness was not associated with the frequency of contacts with children and friends but rather with expectations of and satisfaction with these contacts. The most powerful predictors of loneliness were living alone, depression, experienced poor understanding by the nearest, and unfulfilled expectations of contacts with friends. Conclusion: Our findings support the view that emotional loneliness is a separate concept from social isolation. This has implications for practice. Interventions aiming at relieving loneliness should be focused on enabling an individual to reflect on her own expectations and inner feelings of loneliness BACKGROUND A randomized controlled trial was conducted to examine: (a) the effect of two physical activity modes on changes in subjective well-being (SWB) over the course of a 12-month period in older, formerly sedentary adults (N = 174, M age = 65.5 years) and (b) the role played by physical activity participation and social support in changes in SWB over time. METHOD Participants were randomized into either an aerobic activity group or a stretching and toning group. Structural equation modeling was employed to conduct multiple-sample latent growth curve analyses of individual growth in measures of SWB (happiness, satisfaction with life, and loneliness) over time.
RESULTS A curvilinear growth pattern was revealed, with well-being significantly improving over the course of the intervention followed by significant declines at the 6-month follow-up. Subsequent structural analyses showed that frequency of exercise participation was a significant predictor of improvement in satisfaction with life, whereas social relations were related to increases in satisfaction with life and reductions in loneliness. Improvements in social relations and exercise frequency also helped to buffer the declines in satisfaction with life at follow-up. CONCLUSIONS It appears that social relations integral to the exercise environment are significant determinants of subjective well-being in older adults. Findings are discussed in terms of how physical activity environments might be structured to maximize improvements in more global well-being constructs such as satisfaction with life The Internet (electronic mail and the World Wide Web) may provide new opportunities for communication that can help older adults avoid social isolation. This randomized controlled trial assessed the psychosocial impact of providing Internet access to older adults over a five-month period. One hundred volunteers from four congregate housing sites and two nursing facilities were randomly assigned to receive Internet training or to a wait-list control group. The pre and post measures included the UCLA Loneliness Scale, modified CES Depression scale, a measure of locus of control, computer attitudes, number of confidants, and overall quality of life. Participants received nine hours of small-group training in six sessions over two weeks. Computers were available for continued use over five months and the trainer was available two hours/week for questions. At the end of the trial, 60% of the intervention group continued to use the Internet on a weekly basis.
Although there was a trend toward decreased loneliness and depression in intervention subjects compared to controls, there were no statistically significant changes from baseline to the end of the trial between groups. Among Internet users (n = 29) in the intervention group there were trends toward less loneliness, less depression, more positive attitudes toward computers, and more confidants than among intervention recipients who were not regular users (n = 19) of this technology. Most elderly participants in this trial learned to use the Internet and the majority continued to use it on a weekly basis. The psychosocial impact of Internet use in this sample suggested trends in a positive direction. Further research is needed to determine more precisely which older adults, residing in which environmental contexts, are more likely than others to benefit from this rapidly expanding information and communication link The quality of life of older adults may be improved by the use of computer or Web-based services. A limited number of experimental studies on this topic have shown mixed results. We carried out a randomized, controlled intervention study that aimed to examine the causal relationship between computer use and measures of physical well-being, social well-being, emotional well-being, development and activity, and autonomy. We randomly assigned a group of 191 participants to an intervention group, a training-no intervention group, or a no training-no intervention group. A fourth group consisted of 45 participants with no interest in computer use. We collected data at baseline, after 4 months, and after 12 months. The results showed that using computers and the Internet neither positively nor negatively influenced everyday functioning, well-being and mood, and the social network of healthy older individuals.
We discuss possibilities for future studies This study examined the effects on older adults' psychological health of participation in a volunteer role that afforded opportunities to form friendships with age peers and to express nurturance toward another person. Access to these important social provisions was expected, in turn, to contribute to greater self-esteem, less loneliness, and less depression. The study hypotheses were tested by comparing older adults who served as foster grandparents to a developmentally disabled child (N = 52) with older adults in two comparison groups (Ns = 69, 59). Three assessments were conducted over a two-year period. The analyses revealed that the foster grandparents exhibited a significant increase in the number of new ties formed, but participation in the Foster Grandparent Program was not associated with the expected gains in emotional health. Explanations for the limited findings and implications for future research are discussed BACKGROUND Loneliness among community-dwelling older people is a common problem, with serious health consequences. OBJECTIVES The favourable processes and mediating factors of a psychosocial group rehabilitation intervention in alleviating older people's loneliness were evaluated. DESIGN Altogether, 117 lonely, home-dwelling individuals (aged ≥75 years) participated in a psychosocial group rehabilitation intervention. The content comprised (i) art and inspiring activities, (ii) group exercise and discussions or (iii) therapeutic writing and group therapy. METHODS The psychosocial group rehabilitation intervention was evaluated from the group leaders' diaries and by observing the groups. Experiences of loneliness and social participation were collected by postintervention questionnaires from the participants. Data were analysed using methodological triangulation.
RESULTS Doing things together and sharing experiences with their peers inspired lively discussions, created a feeling of togetherness and led to participants' empowerment and increased self-esteem. The intervention socially activated the participants, and their feelings of loneliness were alleviated during the intervention. CONCLUSION Several common favourable processes and mediating factors were identified in the psychosocial group rehabilitation intervention that led to alleviation of loneliness among older people. Relevance to clinical practice. The psychosocial group rehabilitation intervention gives nurses an effective tool to support older people's psychosocial resources by activating them and alleviating their loneliness The purpose of this study was to examine whether older age is associated with increasing loneliness in people aged 60 and over. Data came from TamELSA, a population-based prospective longitudinal study in Tampere, Finland. The follow-up time was 20 years. Loneliness was measured by a single question -- "Do you feel lonely?" -- with the possible answers often, sometimes, or never. Cross-sectional analysis showed that the percentage of subjects feeling lonely increased toward older age groups, but in a multivariate analysis, only household composition and social participation were independently associated with loneliness. Longitudinal analysis showed that loneliness increased with higher age. Over a 10-year period, loneliness increased most in those who, at baseline, were married and living alone with their spouse. In conclusion, only a minority of older people continuously suffer from loneliness. Loneliness does increase with age, not because of age per se, but because of increasing disability and decreasing social integration
11,016
28,856,498
Compared with the non-MDC group, MDC was associated with a lower risk of all-cause mortality and lower hospitalization rates for patients with CKD. In addition, MDC also resulted in a slower eGFR decline and reduced temporary catheterization for patients receiving dialysis. However, according to the subgroup analysis, the lower rates of all-cause mortality in the MDC group were observed only in patients in stage 4–5 and when the staff of the MDC consisted of nephrologists, nurse specialists and professionals from other fields. The most prominent effect of reducing the hospitalization rates was likewise observed only in patients with stage 4–5 CKD. Conclusions MDC can lower the all-cause mortality of patients with CKD, reduce temporary catheterization for patients receiving dialysis, decrease the hospitalization rate, and slow the eGFR decline. Moreover, the reduction in all-cause mortality crucially depends on the professionals comprising the MDC staff and the stage of CKD in patients. In addition, the CKD stage influences the hospitalization rates
Aim To assess the efficacy of the multidisciplinary care (MDC) model for patients with chronic kidney disease (CKD). Background The MDC model has been used in clinical practice for years, but the effectiveness of the MDC model for patients with CKD remains controversial.
Treatment goals for patients with CKD are often unrealized for many reasons, but support by nurse practitioners may improve risk factor levels in these patients. Here, we analyzed renal endpoints of the Multifactorial Approach and Superior Treatment Efficacy in Renal Patients with the Aid of Nurse Practitioners (MASTERPLAN) study after extended follow-up to determine whether strict implementation of current CKD guidelines through the aid of nurse practitioners improves renal outcome. In total, 788 patients with moderate to severe CKD were randomized to receive nurse practitioner support added to physician care (intervention group) or physician care alone (control group). Median follow-up was 5.7 years. Renal outcome was a secondary endpoint of the MASTERPLAN study. We used a composite renal endpoint of death, ESRD, and a 50% increase in serum creatinine. Event rates were compared with adjustment for baseline serum creatinine concentration, and changes in estimated GFR were determined. During the randomized phase, there were small but significant differences between the groups in BP, proteinuria, LDL cholesterol, and use of aspirin, statins, active vitamin D, and antihypertensive medications, in favor of the intervention group. The intervention reduced the incidence of the composite renal endpoint by 20% (hazard ratio, 0.80; 95% confidence interval, 0.66 to 0.98; P=0.03). In the intervention group, the decrease in estimated GFR was 0.45 ml/min per 1.73 m(2) per year less than in the control group (P=0.01). In conclusion, additional support by nurse practitioners attenuated the decline of kidney function and improved renal outcome in patients with CKD PURPOSE Though case management has been recommended to improve the outcomes of patients with costly or morbid conditions, it has seldom been studied in controlled trials.
We performed a randomized, controlled clinical trial of an intensive, multidisciplinary case management program for patients with chronic renal insufficiency and followed patients for 5 years. PATIENTS AND METHODS We enrolled 437 primary-care patients (73% of those eligible) with chronic renal insufficiency (estimated creatinine clearance consistently <50 mL/min with the last serum creatinine level >1.4 mg/dL) who were attending an urban academic general internal medicine practice. The intensive case management, administered during the first 2 years after enrollment, consisted of mandatory repeated consultations in a nephrology case management clinic staffed by two nephrologists, a renal nurse, a renal dietitian, and a social worker. Control patients received usual care. Primary outcome measurements included serum creatinine level, estimated creatinine clearance, health services use, and mortality in the 5 years after enrollment. Secondary measures included use of renal-sparing and potentially nephrotoxic drugs. RESULTS There were no differences in renal function, health services use, or mortality in the first, second, or third through fifth years after enrollment. There were significantly more outpatient visits among intervention patients, mainly because of the added visits to the nephrology case management clinic. There were also no significant differences in the use of renal-sparing or selected potentially nephrotoxic drugs. The annual direct costs of the intervention were $89,355 ($484 per intervention patient). CONCLUSION This intensive, multidisciplinary case-management intervention had no effect on the outcomes of care among primary-care patients with established chronic renal insufficiency.
Such expensive and intrusive interventions, despite representing state-of-the-art care, should be tested prospectively before being widely introduced into practice.

BACKGROUND AND OBJECTIVES It is unclear how to optimally care for chronic kidney disease (CKD). This study compares a new coordinated model to usual care for CKD. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS A randomized trial in nephrology clinics and the community included 474 patients with median estimated GFR (eGFR) 42 ml/min per 1.73 m(2) identified by laboratory-based case finding; it compared care coordinated by a general practitioner (controls) with care by a nurse-coordinated team including a nephrologist (intervention) for a median (interquartile range [IQR]) of 742 days. 32% were diabetic, 60% had cardiovascular disease, and proteinuria was minimal. Guided by protocols, the intervention team targeted risk factors for adverse kidney and cardiovascular outcomes. Serial eGFR and clinical events were tracked. RESULTS The average decline in eGFR over 20 months was -1.9 ml/min per 1.73 m(2). eGFR declined by ≥4 ml/min per 1.73 m(2) within 20 months in 28 (17%) intervention patients versus 23 (13.9%) control patients. Control of BP, LDL, and diabetes was comparable across groups. In the intervention group there was a trend to greater use of renin-angiotensin blockers and more use of statins in those with initial LDL >2.5 mmol/L. Treatment was rarely required for anemia, acidosis, or disordered mineral metabolism. Clinical events occurred in 5.2% per year. CONCLUSIONS Patients with stage 3/4 CKD identified through community laboratories largely had nonprogressive kidney disease but had cardiovascular risk.
Over a median of 24 months, the nurse-coordinated team did not affect the rate of GFR decline or control of most risk factors compared with usual care.

Background Multidisciplinary pre-dialysis education (MPE) retards renal progression and reduces the incidence of dialysis and mortality of CKD patients. However, the financial benefit of this intervention for patients starting hemodialysis has not yet been evaluated in a prospective and randomized trial. Methods We studied the medical expenditure and utilization incurred in the first 6 months of dialysis initiation in 425 incident hemodialysis patients who were randomized into MPE and non-MPE groups before reaching end-stage renal disease. The content of the MPE was standardized in accordance with the National Kidney Foundation Dialysis Outcomes Quality Initiative guidelines. Results The mean age of study patients was 63.8±13.2 years, and 221 (49.7%) of them were men. The mean serum creatinine level and estimated glomerular filtration rate were 6.1±4.0 mg/dL and 7.6±2.9 mL⋅min−1⋅1.73 m−2, respectively, at dialysis initiation. MPE patients tended to have lower total medical cost in the first 6 months after hemodialysis initiation (9147.6±0.1 USD/patient vs. 11190.6±0.1 USD/patient, p = 0.003), fewer [0 (1) vs. 1 (2), p<0.001] and shorter [0 (15) vs. 8 (27) days, p<0.001] hospitalizations, and also lower inpatient cost [0 (2617.4) vs. 1559.4 (5019.6) USD/patient, p<0.001] than non-MPE patients, principally owing to reduced cardiovascular hospitalization and vascular access-related surgeries. The decreased inpatient and total medical cost associated with MPE were independent of patients' demographic characteristics, concomitant disease, baseline biochemistry and use of a double-lumen catheter at initiation of hemodialysis.
Conclusions Participation in multidisciplinary education in the pre-dialysis period was independently associated with a reduction in the inpatient and total medical expenditures of the first 6 months post-dialysis, owing to decreased inpatient service utilization secondary to cardiovascular causes and vascular access-related surgeries. Trial Registration ClinicalTrials.gov

Background. A well-functioning vascular access (VA) is essential to efficient dialysis therapy. Guidelines have been implemented improving care, yet access use varies widely across countries and VA complications remain a problem. This study took advantage of the unique opportunity to utilize data from the Dialysis Outcomes and Practice Patterns Study (DOPPS) to examine international trends in VA use and trends in patient characteristics and practices associated with VA use from 1996 to 2007. DOPPS is a prospective, observational study of haemodialysis (HD) practices and patient outcomes at >300 HD units from 12 countries and has collected data thus far from >35,000 randomly selected patients. Methods. VA data were collected for each patient at study entry (1996–2007). Practice pattern data from the facility medical director, nurse manager and VA surgeon were also analysed. Results. Since 2005, a native arteriovenous fistula (AVF) was used by 67–91% of prevalent patients in Japan, Italy, Germany, France, Spain, the UK, Australia and New Zealand, and 50–59% in Belgium, Sweden and Canada. From 1996 to 2007, AVF use rose from 24% to 47% in the USA but declined in Italy, Germany and Spain. Moreover, graft use fell by 50% in the USA, from 58% use in 1996 to 28% by 2007. Across three phases of data collection, patients consistently were less likely to use an AVF versus other VA types if female, of older age, or having greater body mass index, diabetes, peripheral vascular disease or recurrent cellulitis/gangrene.
In addition, countries with a greater prevalence of diabetes in HD patients had a significantly lower percentage of patients using an AVF. Despite poorer outcomes for central vein catheters, catheter use rose 1.5- to 3-fold among prevalent patients in many countries from 1996 to 2007, even among non-diabetic patients 18–70 years old. Furthermore, 58–73% of patients new to end-stage renal disease (ESRD) used a catheter for the initiation of HD in five countries, despite 60–79% of patients having been seen by a nephrologist >4 months prior to ESRD. Patients were significantly (P < 0.05) less likely to start dialysis with a permanent VA if treated in a facility that (1) had a longer time from referral to access surgery evaluation or from evaluation to access creation and (2) had a longer time from access creation until first AVF cannulation. The median time from referral until access creation varied from 5–6 days in Italy, Japan and Germany to 40–43 days in the UK and Canada. Compared to patients using an AVF, patients with a catheter displayed significantly lower mean Kt/V levels. Conclusions. Most countries meet the contemporary National Kidney Foundation's Kidney Disease Outcomes Quality Initiative goal for AVF use; however, there is still wide variation in VA preference. Delays between creation and cannulation must be improved to enhance the chances of a future permanent VA. The native arteriovenous fistula is the VA of choice, ensuring dialysis adequacy and better patient outcomes. A graft is, however, a better alternative than a catheter for patients in whom an attempted AVF failed or could not be created for various reasons.

A 1993 National Institutes of Health Consensus statement stressed the importance of early medical intervention in predialysis populations. Given the need for evidence-based practice, we report the outcomes of predialysis programs in two major Canadian cities.
The purpose of this report was to determine whether the institution of a multidisciplinary predialysis program is of benefit to patients, and to analyze those factors that are important in actualizing those benefits. Data from two different studies are presented: (1) a prospective, nonrandomized cohort study comparing patients who were or were not exposed to an ongoing multidisciplinary predialysis team (St Paul's Hospital) and (2) a retrospective review of outcomes before and after the institution of a predialysis program (The Toronto Hospital). Although created independently in major academic centers in Canada, the programs both aimed to reduce urgent dialysis starts, improve preparedness for dialysis, and improve resource utilization. The Vancouver study was able to demonstrate significantly fewer urgent dialysis starts (13% v 35%; P < 0.05), more outpatient training (76% v 43%; P < 0.05), and fewer hospital days in the first month of dialysis (6.5 days v 13.5 days; P < 0.05). Cost savings of the program patients in 1993 are conservatively estimated to be $173,000 (Canadian dollars), or over $4,000 per patient. The Toronto study demonstrated success in predialysis access creation (86.3% of patients), but could not realize any benefit in terms of elective dialysis initiation due to well-documented hemodialysis resource constraints. We conclude that an approach to predialysis patients involving a multidisciplinary team can have a positive impact on quantitative outcomes, but essential elements for success include (1) early referral to a nephrology center, (2) adequate resources for dedicated predialysis program staff and infrastructure, and (3) available resources for patients with end-stage renal disease (ESRD) (dialysis stations). In times of economic constraints, objective data are necessary to justify resource-intensive proactive programs for patients with ESRD.
Future studies should confirm and extend our observations so that optimum and cost-effective care for patients approaching ESRD is uniformly available.

OBJECTIVE Multifaceted care has been shown to reduce mortality and complications in type 2 diabetes. We hypothesized that structured care would reduce renal complications in type 2 diabetes. RESEARCH DESIGN AND METHODS A total of 205 Chinese type 2 diabetic patients from nine public hospitals who had plasma creatinine levels of 150–350 μmol/l were randomly assigned to receive structured care (n = 104) or usual care (n = 101) for 2 years. The structured care group was managed according to a prespecified protocol with the following treatment goals: blood pressure <130/80 mmHg, A1C <7%, LDL cholesterol <2.6 mmol/l, triglyceride <2 mmol/l, and persistent treatment with renin-angiotensin blockers. The primary end point was death and/or renal end point (creatinine >500 μmol/l or dialysis). RESULTS Of these 205 patients (mean ± SD age 65 ± 7.2 years; disease duration 14 ± 7.9 years), the structured care group achieved better control than the usual care group (diastolic blood pressure 68 ± 12 vs. 71 ± 12 mmHg, respectively, P = 0.02; A1C 7.3 ± 1.3 vs. 8.0 ± 1.6%, P < 0.01). After adjustment for age, sex, and study sites, the structured care (23.1%, n = 24) and usual care (23.8%, n = 24; NS) groups had similar end points, but more patients in the structured care group attained ≥3 treatment goals (61%, n = 63, vs. 28%, n = 28; P < 0.001). Patients who attained ≥3 treatment targets (n = 91) had reduced risk of the primary end point (14 vs. 34; relative risk 0.43 [95% CI 0.21–0.86]) compared with those who attained ≤2 targets (n = 114). CONCLUSIONS Attainment of multiple treatment targets reduced the renal end point and death in type 2 diabetes.
In addition to protocols, audits and feedback are needed to improve outcomes.

BACKGROUND The distribution of renal replacement therapy (RRT) modalities among patients varies from country to country, and is often influenced by non-medical factors. In our department, patients progressing towards end-stage renal disease (ESRD) go through a structured Pre-Dialysis Education Programme (PDEP). The goals of the programme, based both on individualized information session(s) given by an experienced nurse to the patient and family and on the use of in-house audio-visual tapes, are to inform on all modalities of RRT, in order to decrease anxiety and promote self-care RRT modalities. METHODS To evaluate the influence of our PDEP on the choice of RRT modalities, we retrospectively reviewed the modalities chosen by all consecutive patients starting a first RRT in our institution between December 1994 and March 2000. RESULTS Two hundred and forty-two patients started a first RRT during the study period. Fifty-seven patients, median age 66 (24–80) years, were directed towards in-centre haemodialysis (HD) for medical or psycho-social reasons (seven of whom were not involved in the PDEP); the remaining 185 patients, median age 53 (7–81) years, with no major medical complications, went through our PDEP. Eight of them (4%) received a pre-emptive renal transplantation. The therapeutic options of the other 177 patients were as follows: 75 (40%) patients, median age 65 (20–81) years, opted for in-centre HD, while 102 patients opted for a self-care modality; 55 (31%) patients, median age 56 (7–77) years, chose peritoneal dialysis; 30 (16%) patients, median age 49 (21–68) years, chose to perform self-care HD in our satellite unit; and 17 (9%) patients, median age 46 (19–70) years, opted for home HD.
Interestingly, in the whole cohort of patients, the cause of ESRD was associated with the RRT modality: the proportion of patients with chronic glomerulonephritis or chronic interstitial nephritis on self-care therapy was significantly higher than that of patients with nephrosclerosis, diabetic nephropathy or an unknown cause of ESRD. CONCLUSION In our centre, which offers all RRT treatment modalities, a high percentage of patients exposed to a structured PDEP start with a self-care RRT modality. This leaves in-centre HD for patients needing medical and nursing care, or for patients refusing to participate in their treatment. Additional large studies, preferably with a randomized design, should delineate the cost-benefit of such a PDEP on the final choice of a RRT modality.

The effectiveness of multidisciplinary care (MDC) in improving health outcomes for patients with chronic kidney disease (CKD) is uncertain. This study sought to determine the association among MDC, survival, and risk for hospitalization among elderly outpatients with CKD. A total of 6978 patients who were 66 yr and older and had CKD were identified between July 1 and December 31, 2001, and followed to December 31, 2004; 187 (2.7%) were followed in an MDC clinic. Logistic regression was used to determine the propensity score (probability of MDC) for each patient, and MDC and non-MDC patients then were matched 1:1 on the basis of their score. A Cox model was used to determine the association between MDC and risk for death and hospitalization. After adjustment for age, gender, baseline GFR, diabetes, and comorbidity score, there was a 50% reduction in the risk for death for the MDC compared with the non-MDC group (hazard ratio [HR] 0.50; 95% confidence interval [CI] 0.35 to 0.71).
There was no difference in the risk for all-cause (HR 0.83; 95% CI 0.64 to 1.06) or cardiovascular-specific hospitalization (HR 0.76; 95% CI 0.54 to 1.06) for the MDC compared with the non-MDC group. In conclusion, MDC was associated with a significant reduction in the risk for all-cause mortality and, although not statistically significant, a trend toward a reduction in risk for all-cause and cardiovascular-specific hospitalizations. The benefits of MDC and an assessment of their economic impact should be tested in a randomized, controlled trial.

BACKGROUND Consensus endorses predialysis intervention before the onset of end-stage renal disease. In a previous study, a predialysis psychoeducational intervention (PPI) extended time to dialysis therapy by a median of 6 months. We undertook to replicate and extend this finding by examining hypothesized mechanisms. METHODS We used an inception-cohort, prospective, randomized, controlled trial with follow-up to evaluate an intervention that included an interactive one-on-one slide-supported educational session, a printed summary (booklet), and supportive telephone calls once every 3 weeks. Participants were sampled from 15 Canadian (tertiary care) nephrology units and included 297 patients with progressive chronic kidney disease (CKD) expected to require renal replacement therapy (RRT) within 6 to 18 months. The main outcome was time to dialysis therapy (censored at 18 months if still awaiting RRT). RESULTS Time to dialysis therapy was significantly longer (median, 17.0 months) for the PPI group than for the usual-care control group (median, 14.2 months; Cox's proportional hazards analysis, controlling for general nonrenal health, P < 0.001). Coping by avoidance of threat-related information (called blunting) was associated with shorter times to dialysis therapy (P < 0.032).
A group × blunting interaction (P < 0.069) indicated: (1) time to dialysis therapy was shortened in the usual-care group, especially when patients coped by blunting; but (2) time to dialysis therapy was extended with PPI, even among patients who coped by blunting. Knowledge acquisition predicted time to dialysis therapy (r = 0.14; P < 0.013). Time to dialysis therapy was unrelated to depression or social support. CONCLUSION PPI extends time to dialysis therapy in patients with progressive CKD. The mechanism may involve the acquisition and implementation of illness-related knowledge. Routine follow-up also may be especially important when patients cope by avoiding threat-related information.

BACKGROUND Previous studies have demonstrated that multidisciplinary pre-dialysis education and team care may slow the decline in renal function in chronic kidney disease (CKD). Our study compared clinical outcomes of CKD patients between multidisciplinary care (MDC) and usual care in Taiwan. METHODS In this 3-year prospective cohort study from 2008 to 2010, we recruited 1056 CKD subjects, aged 20–80 years, from five hospitals, who received either MDC or usual care and had an estimated glomerular filtration rate (eGFR) <60 mL/min; patients were matched one to one with a propensity score including gender, age, eGFR and co-morbid diseases. Care in the MDC group was based on NKF K/DOQI clinical practice guidelines and the Taiwanese pre-end-stage renal disease (ESRD) care program. The incidence of progression to ESRD (initiation of dialysis) and mortality was compared between the two groups. We also monitored blood pressure control, the rate of renal function decline, lipid profile, hematocrit and mineral bone disease control. RESULTS Participants were predominantly male (64.8%), with a mean age of 65.1 years and a mean follow-up of 33.1 months.
The MDC group had higher prescription rates of angiotensin-converting enzyme inhibitors/angiotensin receptor blockers (ACEI/ARB), phosphate binders, vitamin D3, uric acid-lowering agents and erythropoiesis-stimulating therapy, and better control of secondary hyperparathyroidism. The decline of renal function in advanced-stage CKD IV and V was also slower in the MDC group (-5.1 versus -7.3 mL/min, P = 0.01). The use of a temporary dialysis catheter was higher in the usual care group, and CKD patients under MDC intervention exhibited a greater willingness to choose the peritoneal dialysis modality. A Cox regression revealed that the MDC group was associated with a 40% reduction in the risk of hospitalization due to infection, and a 51% reduction in patient mortality, but a 68% increase in the risk of initiating dialysis when compared with the usual care group. CONCLUSIONS MDC patients were found to have more effective medication prescription according to K/DOQI guidelines and slower renal function decline in advanced/late-stage CKD. After MDC intervention, CKD patients had a better survival rate and were more likely to initiate renal replacement therapy (RRT) rather than die.

BACKGROUND Observational studies have demonstrated that multidisciplinary predialysis education (MPE) improves the post-dialysis outcomes of chronic kidney disease (CKD) patients. However, the beneficial effect of MPE remains unclear in prospective controlled studies. METHODS All CKD patients who visited the outpatient nephrology clinics at two centres of the Chang Gung Memorial Hospital in 2006-07 were enrolled. The incidence of dialysis and mortality were compared between MPE recipients and non-recipients. The content of the MPE was standardized in accordance with the NKF/DOQI guidelines. Prognostic factors for progression to end-stage renal disease (ESRD) and all-cause mortality were analysed by using the Cox proportional hazards model.
RESULTS Of 573 patients, 287 received MPE. Dialysis was initiated in 13.9% and 43% of the patients in the MPE and non-MPE groups, respectively (P < 0.001). The mean follow-up period was 11.7 ± 0.9 months. The overall mortality was 1.7% and 10.1% in the MPE and non-MPE groups, respectively (P < 0.001). Cox regression analysis revealed that diabetes, estimated glomerular filtration rate (eGFR), high-sensitivity C-reactive protein (hs-CRP) and MPE assignment were significant independent predictors of progression to ESRD. Independent prognostic factors for mortality included age, diabetes, eGFR, hs-CRP and MPE assignment. CONCLUSIONS MPE based on the NKF/DOQI guidelines may decrease the incidence of dialysis and reduce mortality in late-stage CKD patients.

Background: Although multidisciplinary congestive heart failure clinics in the United States appear to be effective in reducing the number of hospital readmissions, it is unclear whether the same benefit is seen in countries such as Canada, where access to both general and specialized medical care is free and unrestricted. We sought to determine the impact of care at a multidisciplinary specialized outpatient congestive heart failure clinic compared with standard care. Methods: We randomly assigned 230 eligible patients who had experienced an acute episode of congestive heart failure to standard care (n = 115) or follow-up at a multidisciplinary specialized heart failure outpatient clinic (n = 115). The intervention consisted of a structured outpatient clinic environment with complete access to cardiologists and allied health professionals. The primary outcomes were all-cause hospital admission rates and total number of days in hospital at 6 months. The secondary outcomes were total number of emergency department visits, quality of life and total mortality.
Results: At 6 months, fewer patients in the intervention group had required readmission to hospital than patients in the control group (45 [39%] v. 66 [57%]; crude hazard ratio [HR] 0.59, 95% confidence interval [CI] 0.38–0.92). Patients in the intervention group stayed in hospital for 514 days compared with 815 days required by patients in the control group (adjusted HR 0.56, 95% CI 0.35–0.89). The number of patients seen in the emergency department and the total number of emergency department visits were similar in the intervention and control groups. At 6 months, quality of life, which was self-assessed using the Minnesota Living with Heart Failure questionnaire, was unchanged in the control group but improved in the intervention group (p < 0.001). No difference in mortality was observed, with 19 deaths in the control group and 12 in the intervention group (HR 0.61, 95% CI 0.24–1.54). Interpretation: Compared with usual care, care at a multidisciplinary specialized congestive heart failure outpatient clinic reduced the number of hospital readmissions and hospital days and improved quality of life. When our results are integrated with those from other, similar trials, multidisciplinary disease management strategies for congestive heart failure are associated with clinically worthwhile improvements in survival.
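The abstracts above report most effects as hazard ratios with 95% confidence intervals. Since Wald-type CIs for hazard ratios are symmetric on the log scale, a quoted point estimate should sit near the geometric mean of its CI bounds. A minimal sketch of this sanity check (the 0.02 tolerance is an assumption to absorb the two-decimal rounding used in the abstracts):

```python
import math

def hr_ci_consistent(hr, lo, hi, tol=0.02):
    """Check that a reported hazard ratio is consistent with its 95% CI.

    Wald CIs for hazard ratios are symmetric on the log scale, so the
    point estimate should be close to the geometric mean of the bounds.
    """
    return abs(math.sqrt(lo * hi) - hr) <= tol

# Estimates quoted in the abstracts above:
print(hr_ci_consistent(0.80, 0.66, 0.98))  # MASTERPLAN composite renal endpoint
print(hr_ci_consistent(0.50, 0.35, 0.71))  # MDC vs non-MDC mortality
print(hr_ci_consistent(0.59, 0.38, 0.92))  # heart failure clinic readmission
```

All three checks pass, which is what one would expect when the reported intervals come from log-scale Wald estimation.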
11,017
29,412,320
We did not detect effects of SPS on the outcomes of treatment failure and death. These findings suggest that SPS might improve TB treatment outcomes in lower-middle-income economies or countries with a high burden of this disease.
Tuberculosis (TB) is a poverty-related infectious disease that affects millions of people worldwide. Evidence suggests that social protection strategies (SPS) can improve TB treatment outcomes. This study aimed to synthesize this evidence through a systematic literature review and meta-analysis.
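A review like the one described above pools study-level effects via meta-analysis. A minimal sketch of the standard fixed-effect inverse-variance method, working on the log scale; the log risk ratios and standard errors below are purely illustrative, not taken from the review:

```python
import math

def fixed_effect_pool(log_effects_and_ses):
    """Inverse-variance fixed-effect pooling of log-scale effect sizes.

    Each study contributes weight 1/SE^2; returns the pooled log effect
    and its standard error.
    """
    weights = [1.0 / se ** 2 for _, se in log_effects_and_ses]
    pooled = sum(w * e for w, (e, _) in zip(weights, log_effects_and_ses)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, se_pooled

# Hypothetical log risk ratios with standard errors (illustrative only):
studies = [(math.log(0.85), 0.10), (math.log(0.70), 0.20), (math.log(0.90), 0.15)]
log_rr, se = fixed_effect_pool(studies)
rr = math.exp(log_rr)
ci = (math.exp(log_rr - 1.96 * se), math.exp(log_rr + 1.96 * se))
print(round(rr, 2), tuple(round(x, 2) for x in ci))  # pooled RR ≈ 0.84, 95% CI ≈ (0.72, 0.97)
```

A random-effects model (e.g. DerSimonian-Laird) would additionally incorporate between-study heterogeneity, widening the pooled interval.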
Background One of the main strategies to control tuberculosis (TB) is to find and treat people with active disease. Unfortunately, case detection rates remain low in many countries. Thus, we need interventions to find and treat a sufficient number of patients to control TB. We investigated whether involving health extension workers (HEWs: trained community health workers) in TB control improved smear-positive case detection and treatment success rates in southern Ethiopia. Methodology/Principal Findings We carried out a community-randomized trial in southern Ethiopia from September 2006 to April 2008. Fifty-one kebeles (with a total population of 296,811) were randomly allocated to intervention and control groups. We trained HEWs in the intervention kebeles on how to identify suspects, collect sputum, and provide directly observed treatment. The HEWs in the intervention kebeles advised people with a productive cough of 2 weeks or more duration to attend the health posts. Two hundred and thirty smear-positive patients were identified from the intervention kebeles and 88 patients from the control kebeles. The mean case detection rate was higher in the intervention than in the control kebeles (122.2% vs 69.4%, p<0.001). In addition, more female patients were identified in the intervention kebeles (149.0 vs 91.6, p<0.001). The mean treatment success rate was higher in the intervention than in the control kebeles (89.3% vs 83.1%, p = 0.012), and higher for female patients (89.8% vs 81.3%, p = 0.05). Conclusions/Significance The involvement of HEWs in sputum collection and treatment improved the smear-positive case detection and treatment success rates, possibly because of improved service access. This could be applied in settings with low health service coverage and a shortage of health workers.
Trial Registration ClinicalTrials.gov

Background Lay health workers (LHWs) play a pivotal role in addressing the high TB burden in Malawi. LHWs report lack of training to be a key barrier to their role as TB care providers. Given the cost of traditional off-site training, an alternative approach is needed. Our objective was to evaluate the effectiveness of a KT intervention tailored to LHWs' needs. Methods The study design is a pragmatic cluster randomized trial. The study was embedded within a larger trial, PALMPLUS, and compared three arms comprising 28 health centers in Zomba district, Malawi. The control arm included 14 health centers randomized as controls in the larger trial and maintained as control sites. Seven of 14 PALMPLUS intervention sites were randomized to the LHW intervention (PALM/LHW intervention arm), and the remaining 7 PALMPLUS sites were maintained as a PALM-only arm. PALMPLUS intervention sites received an educational outreach program targeting mid-level health workers. LHW intervention sites received both the PALMPLUS intervention and the LHW intervention, employing on-site peer-led educational outreach and a point-of-care tool tailored to LHWs' identified needs. Control sites received no intervention. The main outcome measure is the proportion of treatment successes. Results Among the 28 sites, there were 178 incident TB cases, with 46/80 (0.58) successes in the control group, 44/68 (0.65) successes in the PALMPLUS group, and 21/30 (0.70) successes in the PALM/LHW intervention group. There was no significant effect of the intervention on treatment success in the univariate analysis adjusted for cluster randomization (p = 0.578) or the multivariate analysis controlling for covariates with significant model effects (p = 0.760).
The overall test of the intervention-arm by TB-type interaction approached but did not achieve significance (p = 0.056), with the interaction significant only in the control arm [RR of treatment success for pulmonary TB relative to non-pulmonary TB, 1.18, 95% CI 1.05–1.31]. Conclusions We found no significant treatment effect of our intervention. Given the identified trend for effectiveness and the urgent need for low-cost approaches to LHW training, further evaluation of tailored KT strategies as a means of LHW training in Malawi and other LMICs is warranted. Trial registration ClinicalTrials.gov NCT01356095

OBJECTIVE To investigate the effects of nutritional supplementation on the outcome and nutritional status of south Indian patients with tuberculosis (TB), with and without human immunodeficiency virus (HIV) coinfection, on anti-tuberculous therapy. METHOD Randomized controlled trial on the effect of a locally prepared cereal-lentil mixture providing 930 kcal and a multivitamin micronutrient supplement during anti-tuberculous therapy in 81 newly diagnosed patients with TB alone and 22 TB-HIV-coinfected patients, among whom 51 received and 52 did not receive the supplement. The primary outcome, evaluated at completion of TB therapy, was the outcome of TB treatment as classified by the national programme. Secondary outcomes were body composition, compliance and condition on follow-up 1 year after cessation of TB therapy and supplementation. RESULTS There was no significant difference in TB outcomes at the end of treatment, but HIV-TB-coinfected individuals had four times greater odds of a poor outcome than those with TB alone. Among patients with TB, 1/35 (2.9%) supplemented and 5/42 (12%) non-supplemented patients had poor outcomes, while among TB-HIV-coinfected individuals, 4/13 (31%) supplemented and 3/7 (42.8%) non-supplemented patients had poor outcomes at the end of treatment; the differences were more marked after 1 year of follow-up.
Although there was some trend of benefit for both TB alone and TB-HIV coinfection, the results were not statistically significant at the end of TB treatment, possibly because of the limited sample size. CONCLUSION Nutritional supplements are a potentially feasible, low-cost intervention which could benefit patients with TB and TB-HIV. The public health importance of these diseases in resource-limited settings suggests the need for large, multi-centre randomized controlled trials on nutritional supplementation Background Poverty undermines adherence to tuberculosis treatment. Economic support may both encourage and enable patients to complete treatment. In South Africa, which carries a high burden of tuberculosis, such support may improve the currently poor outcomes of patients on tuberculosis treatment. The aim of this study was to test the feasibility and effectiveness of delivering economic support to patients with pulmonary tuberculosis in a high-burden province of South Africa. Methods This was a pragmatic, unblinded, two-arm cluster-randomized controlled trial in which 20 public sector clinics acted as clusters. Patients with pulmonary tuberculosis in intervention clinics (n = 2,107) were offered a monthly voucher of ZAR120.00 (approximately US$15) until the completion of their treatment. Vouchers were redeemed at local shops for foodstuffs. Patients in control clinics (n = 1,984) received usual tuberculosis care. Results Intention-to-treat analysis showed a small but non-significant improvement in treatment success rates in intervention clinics (intervention 76.2%; control 70.7%; risk difference 5.6% (95% confidence interval: -1.2%, 12.3%), P = 0.107). Low fidelity to the intervention meant that 36.2% of eligible patients did not receive a voucher at all, 32.3% received a voucher for between one and three months, and 31.5% received a voucher for four to eight months of treatment.
There was a strong dose-response relationship between frequency of receipt of the voucher and treatment success (P < 0.001). Conclusions Our pragmatic trial has shown that, in the real-world setting of public sector clinics in South Africa, economic support to patients with tuberculosis does not significantly improve outcomes on treatment. However, the low fidelity of delivery of our voucher meant that a third of eligible patients did not receive it. Among patients in intervention clinics who received the voucher at least once, treatment success rates were significantly improved. Further operational research is needed to explore how best to ensure the consistent and appropriate delivery of such support to those eligible to receive it. Trial registration Current Controlled Trials ISRCTN SETTING Farms in the Boland health district, Western Cape Province, South Africa. OBJECTIVE To evaluate the effect of lay health workers (LHWs) on tuberculosis (TB) control among permanent farm workers and farm dwellers in an area with particularly high TB prevalence. DESIGN Pragmatic, unblinded cluster randomised controlled trial. METHODS This trial measured successful treatment completion rates among new smear-positive (NSP) adult TB patients on 106 intervention farms, and compared them with outcomes in patients on 105 control farms. Farms were the unit of randomisation, and analysis was by intention to treat. RESULTS A total of 164 adult TB patients were recruited into the study, 89 of whom were NSP. The successful treatment completion rate in NSP adult TB patients was 18.7% higher (P = 0.042, 95% CI 0.9-36.4) on farms in the intervention group than on farms in the control group. Case finding for adult NSP TB cases was 8% higher (P = 0.2671) on farms in the intervention group compared to the control group.
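Several of these trials summarize their effect as a risk difference between two proportions with a 95% confidence interval. As an illustration only, a minimal sketch of a simple Wald-type interval, with success counts back-calculated from the voucher trial's reported 76.2% and 70.7% (this is not the cluster-adjusted analysis the trial actually used, so the interval comes out narrower than the published one):

```python
import math

def risk_difference_ci(x1, n1, x2, n2, z=1.96):
    """Wald 95% CI for a difference in proportions (no cluster adjustment)."""
    p1, p2 = x1 / n1, x2 / n2
    rd = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return rd, rd - z * se, rd + z * se

# Approximate success counts back-calculated from the reported percentages
rd, lo, hi = risk_difference_ci(1606, 2107, 1403, 1984)
print(f"RD = {rd:.1%}, naive 95% CI ({lo:.1%}, {hi:.1%})")
```

Ignoring clustering understates the variance, which is why the published cluster-adjusted interval (-1.2%, 12.3%) is wider than this naive one and crosses zero.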
CONCLUSION Trained LHWs were able to improve the successful TB treatment rate among adult NSP TB patients in a well-established health service, despite reduction of services Objective To determine the effectiveness of the provision of whole food to enhance completion of treatment for tuberculosis. Design Parallel-group randomised controlled trial. Setting Three primary care clinics in Dili, Timor-Leste. Participants 270 adults aged ≥18 with previously untreated, newly diagnosed pulmonary tuberculosis. Main outcome measures Completion of treatment (including cure). Secondary outcomes included adherence to treatment, weight gain, and clearance of sputum smears. Outcomes were assessed remotely, blinded to allocation status. Interventions Participants started standard tuberculosis treatment and were randomly assigned to intervention (nutritious, culturally appropriate daily meal (weeks 1-8) and food package (weeks 9-32), n=137) or control (nutritional advice, n=133) groups. The randomisation sequence was computer generated, with allocation concealment by sequentially numbered, opaque, sealed envelopes. Results Most patients with tuberculosis were poor, malnourished men living close to the clinics; 265/270 (98%) contributed to the analysis. The intervention had no significant beneficial or harmful impact on the outcome of treatment (76% v 78% completion, P=0.7) or adherence (93% for both groups, P=0.7) but did lead to improved weight gain at the end of treatment (10.1% v 7.5% improvement, P=0.04). Itch was more common in the intervention group (21% v 9%, P<0.01). In a subgroup analysis of patients with positive results on sputum smears, there were clinically important improvements in one-month sputum clearance (85% v 67%, P=0.13) and completion of treatment (78% v 68%, P=0.3). Conclusion Provision of food did not improve outcomes with tuberculosis treatment in these patients in Timor-Leste.
Further studies in different settings and measuring different outcomes are required. Trial registration Clinical Trials NCT0019256 OBJECTIVE: To test the feasibility of creating a valid and reliable checklist with the following features: appropriate for assessing both randomised and non-randomised studies; provision of both an overall score for study quality and a profile of scores not only for the quality of reporting, internal validity (bias and confounding) and power, but also for external validity. DESIGN: A pilot version was first developed, based on epidemiological principles, reviews, and existing checklists for randomised studies. Face and content validity were assessed by three experienced reviewers, and reliability was determined using two raters assessing 10 randomised and 10 non-randomised studies. Using different raters, the checklist was revised and tested for internal consistency (Kuder-Richardson 20), test-retest and inter-rater reliability (Spearman correlation coefficient and sign rank test; kappa statistics), criterion validity, and respondent burden. MAIN RESULTS: The performance of the checklist improved considerably after revision of the pilot version. The Quality Index had high internal consistency (KR-20: 0.89), as did the subscales apart from external validity (KR-20: 0.54). Test-retest (r 0.88) and inter-rater (r 0.75) reliability of the Quality Index were good. Reliability of the subscales varied from good (bias) to poor (external validity). The Quality Index correlated highly with an existing, established instrument for assessing randomised studies (r 0.90). There was little difference between its performance with non-randomised and with randomised studies. Raters took about 20 minutes to assess each paper (range 10 to 45 minutes).
CONCLUSIONS: This study has shown that it is feasible to develop a checklist that can be used to assess the methodological quality not only of randomised controlled trials but also of non-randomised studies. It has also shown that it is possible to produce a checklist that provides a profile of the paper, alerting reviewers to its particular methodological strengths and weaknesses. Further work is required to improve the checklist and the training of raters in the assessment of external validity SETTING Study conducted in a suburb of Cape Town, South Africa. OBJECTIVE Comparison of successful tuberculosis treatment outcome rates between self supervision, supervision by lay health worker (LHW), and supervision by clinic nurse. METHODS Open, randomised, controlled trial with intention-to-treat analysis. RESULTS All groups (n = 156) achieved similar outcomes (LHW vs. clinic nurse: risk difference 17.2%, 95% confidence interval [CI] -0.1 to 34.5; LHW vs. self supervision 15%, 95% CI -3.7 to 33.6). New patients benefit from LHW supervision (LHW vs. clinic nurse: risk difference 24.2%, 95% CI 6 to 42.5; LHW vs. self supervision 39.1%, 95% CI 17.8 to 60.3), as do female patients (LHW vs. clinic nurse 48.3%, 95% CI 22.8 to 73.8; LHW vs. self supervision 32.6%, 95% CI 6.4 to 58.7). CONCLUSIONS LHW supervision approaches statistically significant superiority but fails to reach it, most likely due to the study's limitation, the small sample size. It is possible that subgroups (new and female patients) do well under LHW supervision. LHW supervision could be offered as one of several supervision options within TB control programmes
11,018
22,261,258
Fluoride mouthrinses have simpler formulations and can have better oral fluoride retention profiles than fluoride toothpastes, depending on post-brushing rinsing behaviors.
The caries-preventive benefits of fluoride are generally accepted by dental researchers and practicing professionals worldwide. The benefits of fluoride toothpastes and mouthrinses have been supported by several high-quality systematic reviews. The formulation of a fluoride toothpaste and biological (salivary flow rate) and behavioral factors (brushing frequency, brushing time, post-brushing rinsing practices, timing of brushing, and amount of toothpaste applied) can influence anticaries efficacy.
In a recent clinical trial of sodium monofluorophosphate dentifrices, oral rinsing habits were found to influence dental caries. Thus an oral fluoride clearance study was undertaken, designed to test a possible mechanism for the observed effects. Eight subjects brushed with one of the trial dentifrices and then rinsed using 1 of 8 procedures of varying thoroughness. The salivary fluoride concentration measured 5 min after dentifrice application decreased significantly with increasing rinse volume, rinse duration, and rinse frequency (p less than 0.01, analysis of variance). The area under the clearance curve determined over a further 3 h was significantly higher (300%; p less than 0.01) following use of the least thorough rinsing procedure (5 ml x 2 s once) as compared with the corresponding area under the clearance curve following the most thorough procedure (20 ml x 10 s twice). These findings indicate that rinsing habits may play an important role in the oral retention of fluoride from dentifrices, which may, in turn, affect their clinical efficacy The aim of the study was to determine whether rinsing with a mouthwash after brushing with a fluoridated toothpaste affected oral fluoride (F) retention and clearance compared with an oral hygiene regime without mouthwash. In this supervised, single-blind study, 3 regimes were compared: (A) brushing for 1 min with 1 g of 1,450 μg F/g NaF toothpaste followed by rinsing for 5 s with 10 ml water; (B) as A but followed by rinsing for 30 s with 20 ml of 100 mg F/l NaF mouthwash; and (C) as B but rinsing for 30 s with a non-fluoridated mouthwash. Twenty-three adults applied each treatment once in a randomised order, separated by 1-week washout periods, and used a non-fluoridated toothpaste at home prior to and during the study.
Whole saliva samples (2 ml), collected before each treatment commenced and 10, 20, 30, 60, 90 and 120 min afterwards, were subsequently analysed for fluoride by ion-specific electrode. The mean (SD) back-transformed log (area under salivary F clearance curve) values were: A = 2.36 (+3.37, -1.39), B = 2.54 (+2.72, -1.31) and C = 1.19 (+1.10, -0.57) mmol F/l × min, respectively. The values for regimes A and B were statistically significantly greater than that for regime C (p < 0.001; paired t test). These findings suggest that use of a non-F mouthwash after toothbrushing with an F toothpaste may reduce the anticaries protection provided by toothbrushing with an F toothpaste alone. The use of a mouthwash with at least 100 mg F/l should minimise this risk The preventive effect of a modified fluoride (F) toothpaste technique was investigated. Both the incidence and progression of approximal caries among Saudi adults with a high caries prevalence were evaluated after 2 years. A total of 175 adults were randomly assigned to a test group and a control group, and 106 completed the study. In the test group (n = 54), the patients were asked to use the provided F toothpaste twice a day. They were instructed to use the "modified F toothpaste technique" as follows: (1) use 2 cm of the toothpaste, (2) brush for 2 min, (3) swish the toothpaste slurry around the dentition with active movements of the cheeks, lips and tongue, forcing the slurry into the approximal areas for about half a minute before spitting it out, and (4) no post-brushing rinsing and no eating/drinking for 2 hr. The patients in the control group (n = 52) were instructed to continue using their regular F toothpaste twice a day without any further instructions. Approximal caries and filled surfaces were scored on bitewing radiographs at baseline and after 2 years.
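The clearance comparisons in these fluoride studies rest on the area under the salivary concentration-time curve (AUC). A minimal trapezoidal-rule sketch, using made-up concentrations at sampling times like those listed above (the actual per-subject data are not reproduced here):

```python
def auc_trapezoid(times, conc):
    """Area under a concentration-time curve by the trapezoidal rule."""
    return sum((t2 - t1) * (c1 + c2) / 2
               for (t1, c1), (t2, c2) in zip(zip(times, conc),
                                             zip(times[1:], conc[1:])))

# Illustrative (made-up) salivary fluoride values, mmol F/l,
# at sampling times in minutes
times = [0, 10, 20, 30, 60, 90, 120]
conc = [0.01, 0.20, 0.10, 0.06, 0.03, 0.02, 0.015]
print(round(auc_trapezoid(times, conc), 3))  # → 5.975 (mmol F/l × min)
```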
The mean (SD) total caries incidence for the test and control groups was 1.15 (1.49) and 3.37 (2.57), respectively (p < 0.001). In general, the control group displayed a higher progression rate than the test group with regard to enamel lesions progressing to dentine (NS), enamel lesions to filled surfaces (p < 0.05) and filled surfaces with recurrent caries (NS). To conclude, the "modified fluoride toothpaste technique", as practised over the 2 years in a Saudi population with a high caries prevalence, had a caries-preventive effect on the incidence of approximal caries, but not on its progression This study compared the ability of two sodium fluoride dentifrices, one containing 5,000 ppm fluoride (Prevident 5000 Plus) and the other 1,100 ppm fluoride (Winterfresh Gel), to reverse primary root caries lesions (PRCLs). A total of 201 subjects with at least one PRCL each entered the study and were randomly allocated to use one of the dentifrices. After 6 months, 186 subjects were included in the statistical analyses. At baseline and after 3 and 6 months, the lesions were clinically assessed and their electrical resistance measured using an electrical caries monitor. After 3 months, 39 (38.2%) of the 102 subjects in the 5,000 ppm F- group and 9 (10.7%) of 84 subjects using the 1,100 ppm F- dentifrice had one or more PRCLs which had hardened (p = 0.005). Between baseline and 3 months, the log10 mean ± SD resistance values of lesions for subjects in the 1,100 ppm F- group had decreased by 0.06±0.55, whereas those in the 5,000 ppm F- group had increased by 0.40±0.64 (p<0.001). After 6 months, 58 (56.9%) of the subjects in the 5,000 ppm F- group and 24 (28.6%) in the 1,100 ppm F- group had one or more PRCLs that had become hard (p = 0.002).
Between baseline and 6 months, the log10 mean ± SD resistance values of lesions for subjects in the 1,100 ppm F- group decreased by 0.004±0.70, whereas in the 5,000 ppm F- group they increased by 0.56±0.76 (p<0.001). After 3 and 6 months, the distance from the apical border of the root caries lesions to the gingival margin increased significantly in the 5,000 ppm F- group when compared with the 1,100 ppm F- group. The plaque index in the 5,000 ppm F- group was also significantly reduced when compared with the 1,100 ppm F- group. The colour of the lesions remained unchanged. It was concluded that the dentifrice containing 5,000 ppm F- was significantly better at remineralising PRCLs than the one containing 1,100 ppm It is now well accepted that the primary anti-caries activity of fluoride (F) is via topical action. The retention of F in the mouth after topical fluoride treatment is considered to be an important factor in the clinical efficacy of F. The purpose of this study was to evaluate F levels in ductal saliva, whole saliva, and pooled plaque after treatment with topical F agents intended for home use. Ten consenting adults, mean (SD) age 31.0 (8.2) years, participated in all aspects of the study. Two days before each test, subjects received a professional tooth cleaning and subsequently abstained from all oral hygiene procedures, to permit plaque to accumulate, and from the use of F-containing dental products. Treatments consisted of a placebo dentifrice (PD), fluoride dentifrice (FD; 0.24% NaF), fluoride rinse (FR; 0.05% NaF), and fluoride gel (FG; 1.1% NaF). Unstimulated whole saliva and pooled plaque were sampled at multiple points over a 24-hour period. In a separate experimental series, stimulated parotid saliva was sampled over a two-hour period after treatment. Fluoride levels generally followed the same pattern in whole saliva and pooled plaque samples, with FG > FR > FD > PD.
Night-time F application resulted in prolonged F retention in whole saliva but not in plaque. Fluoride levels in parotid saliva were only slightly higher after F treatment and returned to baseline levels within two h. The results of this study indicate that the method of F delivery, the F concentration of the agent, and the time of application (daytime vs. night-time) are important factors influencing F levels in the mouth. The recycling of F in ductal saliva as a consequence of the inadvertent ingestion of home-use fluoride products does not appear to make a clinically significant contribution to F levels in the mouth Recent evidence has suggested that the cariostatic effects of topical fluoride (F) are related to the presence of low concentrations of ionic F in the oral environment. The purpose of this study was to compare the retention of F in the oral environment over 24-hour periods after the use of an F dentifrice or an F rinse. Groups of ten consenting adult subjects (age 18-52 years) brushed and/or rinsed (B/R) in a standardized manner twice per day, in the morning (AM) and before bed (PM), with either a placebo dentifrice (8 ppm F), NaF dentifrice (1100 ppm F), or NaF rinse (225 ppm F). Experiments were performed with placebo dentifrice only (PD); F dentifrice only (FD); F dentifrice followed by F rinse (FD/FR); placebo dentifrice followed by F rinse (PD/FR); and F rinse followed by placebo dentifrice (FR/PD). Unstimulated whole saliva samples were collected at baseline and then at 0, 15, 30, and 45 min and 1, 2, and 8 hr after B/R in the AM, after B/R in the PM, and upon rising the following morning. Salivary flow rate and F were determined for each sampling interval.
The results of this study suggest that: (1) F rinse may be a more effective way of delivering topical F than F dentifrice; (2) based on F retention, the combination of FD/FR was not more effective than FR only (PD/FR); (3) older individuals with gingival recession retained higher F levels; and (4) bedtime F application resulted in longer F retention than did daytime application, which may have important implications for enamel remineralization A total of 26 healthy volunteers participated in this randomized 4-leg crossover study designed to measure fluoride (F) retention in interdental plaque and saliva. Two NaF dentifrices (5,000 and 1,450 ppm F) were used, with and without postbrushing water rinsing. The 4 toothbrushing methods were carried out twice a day for 2 weeks. Interdental plaque was collected from all proximal sites after each method, using dental floss. Immediately after the plaque sampling, the subjects were asked to brush their teeth with the same toothpaste and use the same postbrushing water rinsing procedure as previously. Proximal saliva was collected from 4 interdental sites, using small paper points, before and up to 60 min after the brushing. The present study showed that the 5,000 ppm F toothpaste without postbrushing water rinsing resulted in the highest F concentration in both plaque and saliva, and the 1,450 ppm F toothpaste with water rinsing in the lowest. The difference in the area under the curve of saliva F concentration versus time between the 2 methods was 4.2 times (p < 0.001). The corresponding difference in F concentration per unit weight of plaque (n = 16) was 2.75 times (p < 0.05). Water rinsing immediately after toothbrushing with 5,000 ppm reduced the F concentration in saliva by 2.4 times (p < 0.001). The difference in F values in saliva between 5,000/rinsing and 1,450/no rinsing was minor and not significant.
The increase of F in both proximal saliva and plaque when using a dentifrice with 5,000 ppm F without postbrushing water rinsing may be of clinical importance OBJECTIVES To develop a standardized method for measuring the variables affecting fluoride ingestion from toothpaste in young children between the ages of 1.5 and 3.5 years, and to use the method at seven European sites. METHODS Random samples of children were invited to take part in the study. Parents who gave consent were visited at home. The children brushed their teeth using the toothpaste brand and toothbrush type currently in use. The difference between the fluoride dispensed onto the toothbrush and the fluoride recovered, after accounting for losses, was deemed to be the fluoride ingested. Details of other oral health-care habits were collected by questionnaire. For each child, the fluoride concentration of the toothpaste used was measured in the laboratory, from which an estimate of total daily fluoride ingestion was made. RESULTS There was considerable variation between countries in the types of toothpaste used and in the amounts of toothpaste applied and ingested. The amount of fluoride ingested ranged from 0.01 to 0.04 mg fluoride per kg of body weight per day. CONCLUSION The amount of ingested fluoride that is likely to be a risk factor for the development of dental fluorosis during tooth formation is equivocal and was found to vary widely between European countries. There appears to be a need for clearer health messages regarding the use of fluoridated toothpaste by young children
11,019
25,933,004
In comparison with both InSTI-based and CCR5-based therapy, efavirenz-based treatment was associated with a higher risk of therapy discontinuation due to adverse events. The results of our meta-analysis support the present clinical guidelines for antiretroviral-naive, HIV-infected patients, in which efavirenz is one of the most preferred regimens in the analyzed population.
Efavirenz, a non-nucleoside reverse-transcriptase inhibitor (NNRTI), is one of the most commonly prescribed antiretroviral drugs. The present article provides a systematic overview and meta-analysis of clinical trials comparing efavirenz with other active drugs currently recommended for the treatment of HIV-infected, antiretroviral-naive patients.
Objective: To assess the safety and efficacy of two single-tablet regimens for the initial treatment of HIV infection. Design: Phase 2, randomized, double-blind, double-dummy, multicenter, active-controlled study. Methods: Antiretroviral treatment-naive adults with a screening HIV-1 RNA of at least 5000 copies/ml and a CD4 cell count of more than 50 cells/μl were randomized 2:1 to receive fixed-dose combination tablets of elvitegravir/cobicistat/emtricitabine/tenofovir disoproxil fumarate (EVG/COBI/FTC/TDF; n = 48) or efavirenz/emtricitabine/tenofovir disoproxil fumarate (EFV/FTC/TDF; n = 23) for 48 weeks. The primary endpoint was the proportion of participants with HIV-1 RNA less than 50 copies/ml at week 24. Results: Participants receiving EVG/COBI/FTC/TDF exhibited a more rapid decline in HIV-1 RNA, and a greater proportion suppressed viral load to less than 50 copies/ml, than participants receiving EFV/FTC/TDF. Both EVG/COBI/FTC/TDF and EFV/FTC/TDF resulted in high rates of viral suppression and increases in CD4 cell count. Ninety and 83% of participants suppressed HIV-1 RNA to less than 50 copies/ml at both the 24-week and 48-week visits for EVG/COBI/FTC/TDF and EFV/FTC/TDF, respectively. Once-daily administration of EVG/COBI/FTC/TDF provided a mean EVG trough concentration 10-fold over its protein-binding-adjusted IC95 across study visits. EVG/FTC/TDF/GS-9350 was generally well tolerated, with a lower rate of drug-related central nervous system (17%) and psychiatric (10%) adverse events versus EFV/FTC/TDF (26 and 44%, respectively). Decreases in estimated glomerular filtration rate occurred within the first few weeks of dosing in participants receiving EVG/COBI/FTC/TDF, remained within the normal range and did not progress at week 24 or 48; no participant experienced a clinical adverse event or discontinued study drug due to changes in serum creatinine or renal function.
Conclusion: Once-daily EVG/COBI/FTC/TDF achieved and maintained a high rate of virologic suppression, with fewer central nervous system and psychiatric adverse events compared to a current standard-of-care regimen of EFV/FTC/TDF Background Rates of cardiovascular disease are higher among HIV-infected patients as a result of the complex interplay between traditional risk factors, HIV-related inflammatory and immunologic changes, and effects of antiretroviral therapy (ART). This study prospectively evaluated changes in cardiovascular biomarkers in an underrepresented, racially diverse, HIV-1-infected population receiving abacavir/lamivudine as backbone therapy. Methods This 96-week, open-label, randomized, multicenter study compared once-daily fosamprenavir/ritonavir 1400/100 mg and efavirenz 600 mg, both with ABC/3TC 600 mg/300 mg, in antiretroviral-naïve, HLA-B*5701-negative adults without major resistance mutations to the study drugs. We evaluated changes from baseline to weeks 4, 12, 24, 48, and 96 in interleukin-6 (IL-6), high-sensitivity C-reactive protein (hs-CRP), soluble vascular adhesion molecule-1 (sVCAM-1), d-dimer, plasminogen, and fibrinogen. Biomarker data were log-transformed before analysis, and changes from baseline were described using geometric mean ratios. Results This study enrolled 101 patients (51 receiving fosamprenavir/ritonavir; 50 receiving efavirenz): 32% female, 60% African American, and 38% Hispanic/Latino; 66% (67/101) completed 96 weeks on study. At week 96, levels of IL-6, sVCAM-1, d-dimer, fibrinogen, and plasminogen were lower than baseline in both treatment groups, and the decrease was statistically significant for sVCAM-1 (fosamprenavir/ritonavir and efavirenz), d-dimer (fosamprenavir/ritonavir and efavirenz), fibrinogen (efavirenz), and plasminogen (efavirenz).
Values of hs-CRP varied over time in both groups, with a significant increase over baseline at weeks 4 and 24 in the efavirenz group. At week 96, there was no difference between the groups in the percentage of patients with HIV-1 RNA < 50 copies/mL (fosamprenavir/ritonavir 63%; efavirenz 66%) by ITT missing-equals-failure analysis. Treatment-related grade 2-4 adverse events were more common with efavirenz (32%) compared with fosamprenavir/ritonavir (20%), and median lipid concentrations increased in both groups over 96 weeks of treatment. Conclusions In this study of underrepresented patients, treatment with abacavir/lamivudine combined with either fosamprenavir/ritonavir or efavirenz over 96 weeks produced stable or declining biomarker levels except for hs-CRP, including significant and favorable decreases in thrombotic activity (reflected by d-dimer) and endothelial activation (reflected by sVCAM-1). Our study adds to the emerging data that some cardiovascular biomarkers are decreased with the initiation of ART and control of HIV viremia. Trial registration ClinicalTrials.gov identifier Initial viral decay rate may be useful when comparing the relative potency of antiretroviral regimens. Two hundred twenty-seven ART-naïve patients were randomized to receive efavirenz (EFV) (n = 74), lopinavir/ritonavir (LPV/r) (n = 77), or atazanavir/ritonavir (ATV/r) (n = 79) in combination with two NRTIs. The most frequently used NRTI combinations in the EFV and ATV/r groups were the non-thymidine analogues tenofovir and emtricitabine or lamivudine (70% and 68%, respectively) and, in the LPV/r group, lamivudine and the thymidine analogue zidovudine (89%). HIV-1 RNA was monitored during the first 28 days after treatment initiation. Phase 1 and 2 decay rates were estimated in a subset of 157 patients from the RNA decrease from days 0 to 7, and days 14 to 28.
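The first-phase decay rate in the study above is estimated from the log10 viral-load drop between days 0 and 7. A minimal sketch with hypothetical viral loads (the patient-level data are not reproduced here):

```python
import math

def decay_rate(vl_start, vl_end, days):
    """Viral decay rate in log10 copies/ml per day between two sampling days."""
    return (math.log10(vl_start) - math.log10(vl_end)) / days

# Hypothetical phase-1 example: 100,000 copies/ml at day 0, 1,600 at day 7
print(round(decay_rate(1e5, 1.6e3, 7), 3))  # → 0.257
```

A one-log drop per week corresponds to a rate of about 0.14 log10 copies/ml per day; the phase-2 rate would be computed the same way from the day-14 and day-28 samples.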
One-way ANOVA and subsequent Tukey's post hoc tests were used for groupwise comparisons. Mean (95% CI) HIV-1 RNA reductions from days 0 to 28 were 2.59 (2.45-2.73), 2.42 (2.27-2.57), and 2.13 (2.01-2.25) log10 copies/ml for the EFV-, LPV/r-, and ATV/r-based treatment groups, respectively, with a significantly larger decrease in the EFV-based group at all time points compared with ATV/r (p < 0.0001), and compared with LPV/r at days 7-21 (p < 0.0001 to 0.03). LPV/r gave a greater RNA decrease compared with ATV/r from day 14 (p = 0.02). The phase 1 decay rate was significantly higher in the EFV group compared with LPV/r (p = 0.003) or ATV/r (p < 0.0001). No difference was found in the phase 2 decrease. EFV-based treatment gave a more rapid decline in HIV-1 RNA than did either of the boosted protease inhibitor-based regimens. The observed differences may reflect different inherent regimen potencies Background: Raltegravir is an HIV-1 integrase strand-transfer inhibitor with potent in vitro activity. This study explored the antiretroviral activity and safety of raltegravir in treatment-naive patients with plasma HIV-1 RNA levels ≥5000 copies/mL and CD4+ T-cell counts ≥100 cells/mm3. Methods: Multicenter, double-blind, randomized, controlled study of raltegravir at doses of 100, 200, 400, and 600 mg twice daily versus efavirenz at a dose of 600 mg/d, all in combination with tenofovir at a dose of 300 mg/d and lamivudine at a dose of 300 mg/d (ClinicalTrials.gov identifier: NCT00100048). Results: In the 198 patients treated (160 on raltegravir and 38 on efavirenz), the mean HIV-1 RNA level ranged from 4.6 to 4.8 log10 copies/mL at baseline. At weeks 2, 4, and 8, the proportion of patients achieving an HIV-1 RNA level < 50 copies/mL was greater in each of the raltegravir treatment groups than in the efavirenz group.
By week 24, all treatment groups appeared similar, with plasma HIV-1 RNA levels < 400 copies/mL in 85% to 98% of patients and < 50 copies/mL in 85% to 95% of patients. These reductions were maintained through week 48 in 85% to 98% of patients and in 83% to 88% of patients, respectively. Five (3%) patients on raltegravir and 1 (3%) on efavirenz experienced virologic failure before week 48. Drug-related clinical adverse events were less common with raltegravir than with efavirenz. After 24 and 48 weeks of treatment, raltegravir did not result in increased serum levels of total cholesterol, low-density lipoprotein cholesterol, or triglycerides. Conclusions: Raltegravir at all doses studied was generally well tolerated in combination with tenofovir and lamivudine. Raltegravir exhibited potent and durable antiretroviral activity similar to that of efavirenz at 24 and 48 weeks, but achieved HIV-1 RNA levels below detection at a more rapid rate BACKGROUND The MERIT (Maraviroc versus Efavirenz in Treatment-Naive Patients) study compared maraviroc and efavirenz, both with zidovudine-lamivudine, in antiretroviral-naive patients with R5 human immunodeficiency virus type 1 (HIV-1) infection. METHODS Patients screened for R5 HIV-1 were randomized to receive efavirenz (600 mg once daily) or maraviroc (300 mg once or twice daily) with zidovudine-lamivudine. Coprimary end points were the proportions of patients with a viral load < 400 and < 50 copies/mL at week 48; the noninferiority of maraviroc was assessed. RESULTS The once-daily maraviroc arm was discontinued for not meeting prespecified noninferiority criteria. In the primary 48-week analysis (n = 721), maraviroc was noninferior for < 400 copies/mL (70.6% for maraviroc vs 73.1% for efavirenz) but not for < 50 copies/mL (65.3% vs 69.3%) at a threshold of -10%.
More maraviroc patients discontinued for lack of efficacy (11.9% vs 4.2%), but fewer discontinued for adverse events (4.2% vs 13.6%). In a post hoc reanalysis excluding 107 patients (15%) with non-R5 screening virus by the current, more sensitive tropism assay, the lower bound of the 1-sided 97.5% confidence interval for the difference between treatment groups was above -10% for each end point. CONCLUSIONS: Twice-daily maraviroc was not noninferior to efavirenz at < 50 copies/mL in the primary analysis. However, 15% of patients would have been ineligible for inclusion by a more sensitive screening assay. Their retrospective exclusion resulted in similar response rates in both arms. Trial registration: ClinicalTrials.gov identifier NCT00098293. BACKGROUND: Efavirenz with tenofovir-disoproxil-fumarate and emtricitabine is a preferred antiretroviral regimen for treatment-naive patients infected with HIV-1. Rilpivirine, a new non-nucleoside reverse transcriptase inhibitor, has shown similar antiviral efficacy to efavirenz in a phase 2b trial with two nucleoside/nucleotide reverse transcriptase inhibitors. We aimed to assess the efficacy, safety, and tolerability of rilpivirine versus efavirenz, each combined with tenofovir-disoproxil-fumarate and emtricitabine. METHODS: We did a phase 3, randomised, double-blind, double-dummy, active-controlled trial in patients infected with HIV-1 who were treatment-naive. The patients were aged 18 years or older with a plasma viral load at screening of 5000 copies per mL or greater, and viral sensitivity to all study drugs. Our trial was done at 112 sites across 21 countries. Patients were randomly assigned by a computer-generated interactive web response system to receive either once-daily 25 mg rilpivirine or once-daily 600 mg efavirenz, each with tenofovir-disoproxil-fumarate and emtricitabine.
Our primary objective was to show non-inferiority (12% margin) of rilpivirine to efavirenz in terms of the percentage of patients with confirmed response (viral load < 50 copies per mL, intention-to-treat time-to-loss-of-virological-response [ITT-TLOVR] algorithm) at week 48. Our primary analysis was by intention-to-treat. We also used logistic regression to adjust for baseline viral load. This trial is registered with ClinicalTrials.gov, number NCT00540449. FINDINGS: 346 patients were randomly assigned to receive rilpivirine and 344 to receive efavirenz and received at least one dose of study drug, with 287 (83%) and 285 (83%) in the respective groups having a confirmed response at week 48. The point estimate from a logistic regression model for the percentage difference in response was -0.4 (95% CI -5.9 to 5.2), confirming non-inferiority with a 12% margin (primary endpoint). The incidence of virological failure was 13% (rilpivirine) versus 6% (efavirenz; 11% vs 4% by ITT-TLOVR). Grade 2-4 adverse events (55 [16%] on rilpivirine vs 108 [31%] on efavirenz, p<0.0001), discontinuations due to adverse events (eight [2%] on rilpivirine vs 27 [8%] on efavirenz), rash, dizziness, and abnormal dreams or nightmares were more common with efavirenz. Increases in plasma lipids were significantly lower with rilpivirine. INTERPRETATION: Rilpivirine showed non-inferior efficacy compared with efavirenz, with a higher virological-failure rate but a more favourable safety and tolerability profile. FUNDING: Tibotec. PURPOSE: A randomized, open-label, pilot study was undertaken to explore the antiviral activity and tolerability of two nonnucleoside reverse transcriptase inhibitors (NNRTIs), nevirapine (NVP) and efavirenz (EFV).
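The non-inferiority logic used in trials like the one above can be sketched in a few lines. The responder counts are taken from the abstract, but the unadjusted Wald interval below is an illustrative assumption; the published analysis used logistic regression adjusted for baseline viral load:

```python
from math import sqrt

# Week-48 responders from the abstract: 287/346 rilpivirine, 285/344 efavirenz.
x1, n1 = 287, 346   # rilpivirine
x2, n2 = 285, 344   # efavirenz
p1, p2 = x1 / n1, x2 / n2

# Unadjusted Wald 95% CI for the difference in response proportions.
diff = p1 - p2
se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
lo, hi = diff - 1.96 * se, diff + 1.96 * se

# Non-inferiority is declared when the lower CI bound stays above the margin.
margin = -0.12
print(f"diff = {diff:+.3f}, 95% CI ({lo:+.3f}, {hi:+.3f})")
print("non-inferior" if lo > margin else "inconclusive")
```

Even this crude interval lands close to the adjusted one reported in the abstract (-5.9 to 5.2 percentage points), with the lower bound comfortably above the -12% margin.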
METHOD: HIV-infected antiretroviral-naive adults with CD4 counts > 100 cells/mm3 and detectable plasma HIV RNA below 100,000 copies/mL were randomized to receive didanosine (ddI) and stavudine (d4T) plus either NVP or EFV. Assessments were made every 12 weeks. Primary endpoints were the proportion of patients reaching plasma HIV RNA < 50 copies/mL and/or developing NNRTI-related toxicities leading to drug discontinuation. Baseline characteristics were comparable for participants in the EFV (n = 31) and NVP (n = 36) arms. RESULTS: At 48 weeks, 23/31 (74%) patients in the EFV group and 23/36 (64%) in the NVP group had < 50 HIV RNA copies/mL (intention-to-treat analysis). Adverse events led to NNRTI discontinuation in 4 and 3 patients in the EFV and NVP arms, respectively. There were no statistically significant differences between groups on any primary endpoint. NVP and EFV along with two NRTIs may be equally well tolerated and effective at achieving < 50 HIV RNA copies/mL in naive patients with CD4 counts > 100 cells/mm3 and HIV RNA < 10⁵ copies/mL. CONCLUSION: A much larger study is needed to demonstrate any significant differences between NVP and EFV, if they exist at all. BACKGROUND: Vicriviroc (VCV) is a CCR5 antagonist with nanomolar activity against human immunodeficiency virus (HIV) replication in vitro and in vivo. We report the results of a phase II dose-finding study of VCV plus dual nucleoside reverse-transcriptase inhibitors (NRTIs) in treatment-naive HIV-1-infected subjects. METHODS: This study was a randomized, double-blind, placebo-controlled trial that began with a 14-day comparison of 3 dosages of VCV with placebo in treatment-naive subjects infected with CCR5-using HIV-1. After 14 days of monotherapy, lamivudine/zidovudine was added to the VCV arms; subjects receiving placebo were treated with efavirenz and lamivudine/zidovudine; the planned treatment duration was 48 weeks.
RESULTS: Ninety-two subjects enrolled. After 14 days of once-daily monotherapy, the mean viral loads decreased from baseline values by 0.07 log(10) copies/mL in the placebo arm, 0.93 log(10) copies/mL in the VCV 25 mg arm, 1.18 log(10) copies/mL in the VCV 50 mg arm, and 1.34 log(10) copies/mL in the VCV 75 mg arm (P < .001 for each VCV arm vs. the placebo arm). The combination-therapy portion of the study was stopped because of increased rates of virologic failure in the VCV 25 mg/day arm (relative hazard [RH], 21.6; 95% confidence interval [CI], 2.8-168.9) and the VCV 50 mg/day arm (RH, 11.7; 95% CI, 1.5-92.9), compared with that in the control arm. CONCLUSIONS: VCV administered with dual NRTIs in treatment-naive subjects with HIV-1 infection had increased rates of virologic failure compared with efavirenz plus dual NRTIs. No treatment-limiting toxicity was observed. Study of higher doses of VCV as part of combination therapy is warranted. Objective: To compare the efficacy of efavirenz (EFV) vs lopinavir/ritonavir (LPV/r) in combination with azidothymidine/lamivudine in antiretroviral therapy-naive, HIV+ individuals presenting for care with CD4+ counts < 200/mm3. Methods: Prospective, randomized, open-label, multicenter trial in Mexico. HIV-infected subjects with CD4 < 200/mm3 were randomized to receive open-label EFV or LPV/r plus azidothymidine/lamivudine (fixed-dose combination) for 48 weeks. Randomization was stratified by baseline CD4+ cell count (≤100 or > 100/mm3). The primary endpoint was the percentage of patients with plasma HIV-1 RNA < 50 copies/mL at 48 weeks by intention-to-treat analysis. Results: A total of 189 patients (85% men) were randomized to receive EFV (95) or LPV/r (94). Median baseline CD4+ counts were 64 and 52/mm3, respectively (P = not significant).
At week 48, by intention-to-treat analysis, 70% of EFV and 53% of LPV/r patients achieved HIV-1 RNA < 50 copies/mL [estimated difference 17% (95% confidence interval 3.5 to 31), P = 0.013]. The proportion with HIV-1 RNA < 400 copies/mL was 73% with EFV and 65% with LPV/r (P = 0.25). Virologic failure occurred in 7 patients on EFV and 17 on LPV/r. Mean CD4+ count increases (cells/mm3) were 234 for EFV and 239 for LPV/r. Mean changes in total cholesterol and triglyceride levels were 50 and 48 mg/dL with EFV and 63 and 116 mg/dL with LPV/r (P = 0.24 and P < 0.01). Conclusions: In these very advanced HIV-infected ARV-naive subjects, EFV-based highly active antiretroviral therapy had superior virologic efficacy to LPV/r-based highly active antiretroviral therapy, with a more favorable lipid profile. Background: National initiatives offering non-nucleoside reverse transcriptase inhibitor (NNRTI)-based combination antiretroviral therapy (cART) have expanded in sub-Saharan Africa. The Tshepo study is the first clinical trial evaluating the long-term efficacy and tolerability of efavirenz- versus nevirapine-based cART among adults in Botswana. Methods: A 3-year randomized study (n = 650) using a 3 × 2 × 2 factorial design comparing efficacy and tolerability among: (i) zidovudine/lamivudine versus zidovudine/didanosine versus stavudine/lamivudine; (ii) efavirenz versus nevirapine; and (iii) community-based supervision versus standard adherence strategies. This paper focuses on comparison (ii). Results: There was no significant difference by assigned NNRTI in time to virological failure with resistance (log-rank P = 0.14), nevirapine versus efavirenz [risk ratio (RR) 1.54, 95% CI 0.86-2.70]. Rates of virological failure with resistance were 9.6% among nevirapine-treated patients (95% CI 6.8-13.5) versus 6.6% among efavirenz-treated patients (95% CI 4.2-10.0) at 3 years.
Women receiving nevirapine-based cART trended towards higher virological failure rates than efavirenz-treated women, Holm-corrected (log-rank P = 0.072), nevirapine versus efavirenz (RR 2.22, 95% CI 0.94-5.00). A total of 139 patients had 176 treatment-modifying toxicities, with a shorter time to event in nevirapine-treated versus efavirenz-treated patients (RR 1.85, 1.20-2.86; log-rank P = 0.0002). Conclusion: Tshepo-treated patients had excellent overall immunological and virological outcomes, and no significant differences were observed by randomized NNRTI comparison. Nevirapine-treated women trended towards higher virological failure with resistance compared with efavirenz-treated women. Nevirapine-treated adults had higher treatment-modifying toxicity rates than those receiving efavirenz. Nevirapine-based cART can continue to be offered to women in sub-Saharan Africa if patient education concerning toxicity is emphasized, routine safety-monitoring chemistries are performed, and the potential risk of efavirenz-related teratogenicity is considered. BACKGROUND: The use of either efavirenz or lopinavir-ritonavir plus two nucleoside reverse-transcriptase inhibitors (NRTIs) is recommended for initial therapy for patients with human immunodeficiency virus type 1 (HIV-1) infection, but which of the two regimens has greater efficacy is not known. The alternative regimen of lopinavir-ritonavir plus efavirenz may prevent toxic effects associated with NRTIs. METHODS: In an open-label study, we compared three regimens for initial therapy: efavirenz plus two NRTIs (efavirenz group), lopinavir-ritonavir plus two NRTIs (lopinavir-ritonavir group), and lopinavir-ritonavir plus efavirenz (NRTI-sparing group). We randomly assigned 757 patients with a median CD4 count of 191 cells per cubic millimeter and a median HIV-1 RNA level of 4.8 log10 copies per milliliter to the three groups.
RESULTS: At a median follow-up of 112 weeks, the time to virologic failure was longer in the efavirenz group than in the lopinavir-ritonavir group (P=0.006) but did not differ significantly between the NRTI-sparing group and either of the other two groups. At week 96, the proportion of patients with fewer than 50 copies of plasma HIV-1 RNA per milliliter was 89% in the efavirenz group, 77% in the lopinavir-ritonavir group, and 83% in the NRTI-sparing group (P=0.003 for the comparison between the efavirenz group and the lopinavir-ritonavir group). The groups did not differ significantly in the time to discontinuation because of toxic effects. At virologic failure, antiretroviral resistance mutations were more frequent in the NRTI-sparing group than in the other two groups. CONCLUSIONS: Virologic failure was less likely in the efavirenz group than in the lopinavir-ritonavir group. The virologic efficacy of the NRTI-sparing regimen was similar to that of the efavirenz regimen but was more likely to be associated with drug resistance. (ClinicalTrials.gov number, NCT00050895.) BACKGROUND: The side effects of antiretroviral drugs differ between Japanese and Caucasian patients: severe central nervous system (CNS) side effects with efavirenz and a low rate of hypersensitivity to abacavir characterize the Japanese. OBJECTIVE: The objective of this study was to select a once-daily regimen for a further non-inferiority study comparing the virological efficacy and safety of the first-line once-daily antiretroviral treatment regimens in the current HIV/AIDS guideline. METHODS: The study design was a randomized, open-label, multicenter selection study. One arm was treated with efavirenz and the other with ritonavir-boosted atazanavir. Fixed-dose lamivudine plus abacavir was used in both arms. The primary endpoint was the virologic success (viral load less than 50 copies/mL) rate at 48 weeks.
Patients were followed up to 96 weeks with safety as the secondary endpoint. Trial registration: ClinicalTrials.gov (NCT00280969) and the University Hospital Medical Information Network (UMIN000000243). RESULTS: A total of 71 participants were enrolled. Virologic success rates in both arms were similar at week 48 [efavirenz arm 28/36 (77.8%); atazanavir arm 27/35 (77.1%)] but decreased by week 96 to 55.6% in the efavirenz arm and 68.8% in the atazanavir arm (p=0.33). At the 96-week follow-up, 52.8% of the EFV arm and 34.3% of the ATV/r arm had reached a total cholesterol of more than 220 mg/dL and required treatment. None of the patients developed cardiovascular complications in this study by week 96. CONCLUSION: There was no significant difference in the efficacy of efavirenz and ritonavir-boosted atazanavir combined with lamivudine plus abacavir at 48 weeks. The evaluation of safety was extended to 96 weeks, which also showed no significant difference between the arms. BACKGROUND: Dolutegravir (S/GSK1349572) is a new HIV-1 integrase inhibitor that has antiviral activity with once-daily, unboosted dosing. SPRING-1 is an ongoing study designed to select a dose for phase 3 assessment. We present data from preplanned primary and interim analyses. METHODS: In a phase 2b, multicentre, dose-ranging study, treatment-naive adults were randomly assigned (1:1:1:1) to receive 10 mg, 25 mg, or 50 mg dolutegravir or 600 mg efavirenz. Dose but not drug allocation was masked. Randomisation was by a central integrated voice-response system according to a computer-generated code. Study drugs were given with either tenofovir plus emtricitabine or abacavir plus lamivudine. Our study was done at 34 sites in France, Germany, Italy, Russia, Spain, and the USA, beginning on July 9, 2009.
Eligible participants were seropositive for HIV-1, aged 18 years or older, and had plasma HIV RNA viral loads of at least 1000 copies per mL and CD4 counts of at least 200 cells per μL. Our primary endpoint was the proportion of participants with a viral load of less than 50 copies per mL at week 16, and we present data to week 48. Analyses were done on the basis of allocation group and included all participants who received at least one dose of study drug. This study is registered with ClinicalTrials.gov, number NCT00951015. FINDINGS: 205 patients were randomly allocated and received at least one dose of study drug: 53, 51, and 51 to receive 10 mg, 25 mg, and 50 mg dolutegravir, respectively, and 50 to receive efavirenz. Week 16 response rates (viral load of at most 50 copies per mL) were 93% (144 of 155 participants) for all doses of dolutegravir (with little difference between dose groups) and 60% (30 of 50) for efavirenz; week 48 response rates were 87% (139 of 155) for all doses of dolutegravir and 82% (41 of 50) for efavirenz. Response rates between nucleoside reverse transcriptase inhibitor subgroups were similar. We identified three virological failures in the dolutegravir groups and one in the efavirenz group; we did not identify any integrase inhibitor mutations. We did not identify any dose-related clinical or laboratory toxic effects; drug-related adverse events of moderate or higher intensity were more common in the efavirenz group (20%) than in the dolutegravir groups (8%). We did not judge that any serious adverse events were related to dolutegravir. INTERPRETATION: Dolutegravir was effective when given once daily without a pharmacokinetic booster and was well tolerated at all assessed doses.
Our findings support the assessment of once-daily 50 mg dolutegravir in phase 3 trials. Background: Current guidelines state that the goal of antiretroviral therapy for HIV-infected individuals is to suppress plasma viral load (pVL) to below 400 copies/ml. Methods: Predictors of achieving and maintaining pVL suppression were examined in a randomized trial of combinations of zidovudine, nevirapine and didanosine in patients with CD4+ T cell counts of between 200 and 600 × 10⁶ cells/l who were naive to antiretroviral therapy and AIDS-free at enrolment. Results: One hundred and four patients had pVL > 500 copies/ml at baseline and a pVL nadir below 500 copies/ml. Of these, 77 patients experienced an increase in pVL above 500 copies/ml. The median number of days of pVL suppression below 500 copies/ml was 285 (42) for patients with a pVL nadir ≤ (>) 20 copies/ml (P = 0.0001). The relative risk of an increase in pVL above 500 copies/ml associated with a pVL nadir below 20 copies/ml was 0.11 (P = 0.0001). The relative risks of an increase in pVL above 5000 copies/ml associated with a pVL nadir below 20 copies/ml or between 20 and 400 copies/ml were 0.05 [95% confidence interval (CI), 0.02-0.12] and 0.37 (95% CI, 0.23-0.61), respectively, compared with individuals with a pVL nadir > 400 copies/ml. Individuals with a pVL nadir ≤ 20 copies/ml were at a significantly lower risk of virologic failure than individuals with a pVL nadir of between 21 and 400 copies/ml (P = 0.0001). Conclusions: Our results demonstrate that suppression of pVL below 20 copies/ml is necessary to achieve a long-term antiretroviral response. Our data support the need for a revision of current therapeutic guidelines for the management of HIV infection. Late diagnosis of HIV-1 infection is quite frequent in Western countries.
Very few randomized clinical trials to determine the best antiretroviral treatment in patients with advanced HIV-1 infection have been performed. To compare immune reconstitution in two groups of very immunosuppressed (less than 100 CD4(+) cells/microl), antiretroviral-naive HIV-1-infected adults, 65 patients were randomly assigned in a 1:1 ratio to receive zidovudine + lamivudine + efavirenz (group A, 34 patients) or zidovudine + lamivudine + ritonavir-boosted indinavir (group B, 31 patients). The median (interquartile range) CD4(+) cell increase after 12 and 36 months was +199 (101, 258) and +299 (170, 464) cells/microl in the efavirenz arm and +136 (57, 235) and +228 (119, 465) cells/microl in the ritonavir-boosted indinavir arm (p > 0.05 for all time points). The proportion (95% confidence interval) of patients achieving HIV-1 RNA levels under 50 copies/ml was significantly greater in the efavirenz arm at 3 years by the intention-to-treat analysis [59% (41%, 75%) vs. 23% (10%, 41%)], whereas no differences were found in the on-treatment analysis. Immune activation (CD8(+)CD38(+) and CD8(+)CD38DR(+) T cells) was significantly lower in the efavirenz arm from month 6 to month 24. Adverse events were more frequent in the ritonavir-boosted indinavir arm. Almost all cases of disease progression and death were observed in the first year of treatment, with no significant differences between the two arms (p = 0.79 by the log-rank test).
At 1 and 3 years, the immune reconstitution induced by an efavirenz-based regimen in very immunosuppressed patients was at least as potent as that induced by a ritonavir-boosted protease inhibitor-based antiretroviral regimen. Background: Glomerular filtration rate (GFR) estimated by the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equations based on creatinine or cystatin C may be a more accurate method, especially in patients without chronic kidney disease. There is a lack of data on GFR estimated by these methods in patients on highly active antiretroviral therapy. Methods: Antiretroviral-naive HIV-infected patients were randomized to tenofovir/emtricitabine in association with atazanavir/ritonavir (ATV/r) or efavirenz (EFV). Patients had to have an actual creatinine clearance > 50 mL/minute (24-hour urine collection) and were followed for 48 weeks. Results: Ninety-one patients (48 ATV/r, 43 EFV) were recruited. Using the CKD-EPI creatinine formula, there was a significant decrease in GFR up to week 48 in patients receiving ATV/r (4.9 mL/minute/m2, P = 0.02) compared with a not statistically significant increment in patients prescribed EFV. Using the cystatin C-based equation, we found a greater decrease in GFR in both arms, although in the EFV arm the decrease was not statistically significant (5.8 mL/minute/m2, P = 0.92). At multivariable analysis, ATV/r was a significant predictor of a greater decrease in estimated glomerular filtration rate (eGFR) (P = 0.0046) only with CKD-EPI creatinine. Conclusions: ATV/r plus tenofovir caused greater GFR decreases compared with EFV. The evaluation of eGFR by cystatin C confirmed this result, but this method seemed to be more stringent, probably precluding the possibility of detecting a significant difference in the pattern of eGFR evolution between the two arms over time.
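For reference, the CKD-EPI creatinine formula mentioned above is the standard 2009 equation; a minimal sketch follows (this is the published general-population equation, not the study's own analysis code, and the cystatin C variant used in the trial is omitted):

```python
def ckd_epi_creatinine(scr_mg_dl: float, age: int, female: bool, black: bool = False) -> float:
    """2009 CKD-EPI creatinine equation; returns eGFR in mL/min/1.73 m^2."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = 141.0 * min(ratio, 1.0) ** alpha * max(ratio, 1.0) ** -1.209 * 0.993 ** age
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

# Illustrative inputs (not trial data): a 40-year-old man with serum
# creatinine 1.0 mg/dL, and a 50-year-old woman with 0.7 mg/dL.
print(round(ckd_epi_creatinine(1.0, 40, female=False), 1))
print(round(ckd_epi_creatinine(0.7, 50, female=True), 1))
```

The piecewise min/max terms are what make the equation more accurate than older formulas at near-normal creatinine, which is why it suits cohorts like this one without pre-existing chronic kidney disease.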
More studies are needed to understand the clinical relevance of these alterations and whether cystatin C is a more appropriate method for monitoring GFR in clinical practice. BACKGROUND: Long-term data from randomised trials on the consequences of treatment with a protease inhibitor (PI), non-nucleoside reverse transcriptase inhibitor (NNRTI), or both are lacking. Here, we report results from the FIRST trial, which compared initial treatment strategies for clinical, immunological, and virological outcomes. METHODS: Between 1999 and 2002, 1397 antiretroviral-treatment-naive patients, presenting at 18 clinical trial units with 80 research sites in the USA, were randomly assigned in a ratio of 1:1:1 to a protease inhibitor (PI) strategy (PI plus nucleoside reverse transcriptase inhibitor [NRTI]; n=470), a non-nucleoside reverse transcriptase inhibitor (NNRTI) strategy (NNRTI plus NRTI; n=463), or a three-class strategy (PI plus NNRTI plus NRTI; n=464). Primary endpoints were a composite of an AIDS-defining event, death, or CD4 cell count decline to less than 200 cells per mm3 for the PI versus NNRTI comparison, and average change in CD4 cell count at or after 32 months for the three-class versus combined two-class comparison. Analyses were by intention-to-treat. This study is registered with ClinicalTrials.gov, number NCT00000922. FINDINGS: 1397 patients were assessed for the composite endpoint. A total of 388 participants developed the composite endpoint, 302 developed AIDS or died, and 188 died. NNRTI versus PI hazard ratios (HRs) for the composite endpoint, for AIDS or death, for death, and for virological failure were 1.02 (95% CI 0.79-1.31), 1.07 (0.80-1.41), 0.95 (0.66-1.37), and 0.66 (0.56-0.78), respectively. 1196 patients were assessed for the three-class versus combined two-class primary endpoint.
Mean change in CD4 cell count at or after 32 months was +234 cells per mm3 and +227 cells per mm3 for the three-class and the combined two-class strategies, respectively (p=0.62). HRs (three-class vs combined two-class) for AIDS or death and for virological failure were 1.15 (0.91-1.45) and 0.87 (0.75-1.00), respectively. HRs (three-class vs combined two-class) for AIDS or death were similar for participants with baseline CD4 cell counts of 200 cells per mm3 or less and of more than 200 cells per mm3 (p=0.38 for interaction), and for participants with baseline HIV RNA concentrations of less than 100,000 copies per mL and of 100,000 copies per mL or more (p=0.26 for interaction). Participants assigned the three-class strategy were significantly more likely to discontinue treatment because of toxic effects than were those assigned to the two-class strategies (HR 1.58; p<0.0001). INTERPRETATION: Initial treatment with either an NNRTI-based regimen or a PI-based regimen, but not both together, is a good strategy for long-term antiretroviral management in treatment-naive patients with HIV. BACKGROUND: This phase IIb study explored the antiviral activity and safety of the investigational CCR5 antagonist aplaviroc (APL) in antiretroviral-naive patients harbouring R5-tropic virus. METHODS: One hundred and forty-seven patients were randomized 2:2:1 to one of two APL dosing regimens or efavirenz (EFV). All dosage arms were administered twice daily and in combination with lamivudine/zidovudine (3TC/ZDV; Combivir, COM). Efficacy, safety, and pharmacokinetic parameters were assessed. RESULTS: This study was prematurely terminated due to APL-associated idiosyncratic hepatotoxicity. The primary endpoint of the study was the proportion of patients with plasma HIV-1 RNA < 400 copies/ml who remained on randomized treatment through week 12.
Of the 147 patients enrolled, 145 patients received one dose of treatment and were included in the intention-to-treat population. The proportion of patients with HIV-1 RNA < 400 copies/ml at week 12 was 53%, 50%, and 66% in the APL 600 mg twice-daily, APL 800 mg twice-daily, and EFV arms, respectively. Common clinical adverse events (AEs) were diarrhoea, nausea, fatigue, and headache. APL demonstrated non-linear pharmacokinetics with high interpatient variability. In addition to the hepatic findings, there was an apparent dose-response relationship in the incidence of diarrhoea. CONCLUSIONS: Whereas target plasma concentrations of APL were achieved, the antiviral activity of APL as the third agent in a triple-drug regimen did not appear to be comparable to that of EFV in this treatment-naive patient population. Objectives: The purpose of this study was to evaluate the safety and efficacy of raltegravir- versus efavirenz-based antiretroviral therapy after 96 weeks in treatment-naive patients with HIV-1 infection. Methods: Multicenter, double-blind, randomized study of raltegravir (100, 200, 400, or 600 mg twice a day) versus efavirenz (600 mg every day), both with tenofovir/lamivudine (TDF/3TC), for 48 weeks, after which the raltegravir arms were combined and all dosed at 400 mg twice a day. Eligible patients had HIV-1 RNA ≥5000 copies per milliliter and CD4+ T cells ≥100 cells per microliter. Results: One hundred ninety-eight patients were randomized and treated; 160 received raltegravir and 38 received efavirenz. At week 96, 84% of patients in both groups achieved HIV-1 RNA < 400 copies per milliliter; 83% in the raltegravir group and 84% in the efavirenz group achieved < 50 copies per milliliter (noncompleter = failure). Both groups showed similar increases in CD4+ T cells (221 vs 232 cells/μL, respectively).
An additional 2 patients (1 in each group) met the protocol definition of virologic failure between weeks 48 and 96; no known resistance mutations were observed in the raltegravir recipient, whereas the efavirenz recipient had nucleoside reverse transcriptase inhibitor and nonnucleoside reverse transcriptase inhibitor resistance mutations. Investigator-reported drug-related clinical adverse events (AEs) were less frequent with raltegravir (51%) than with efavirenz (74%). Drug-related AEs occurring in > 10% of patients in either group were nausea in both groups and dizziness and headache in the efavirenz group. Laboratory AEs remained infrequent. Raltegravir had no adverse effect on total or low-density lipoprotein cholesterol or on triglycerides. Neuropsychiatric AEs remained less frequent with raltegravir (34%) than with efavirenz (58%). There were no drug-related serious AEs in patients receiving raltegravir. Conclusions: In antiretroviral therapy-naive patients, raltegravir with TDF/3TC had potent antiretroviral activity, which was similar to that of efavirenz/TDF/3TC and was sustained to week 96. Raltegravir was generally well tolerated; drug-related AEs were less frequent in patients treated with raltegravir than with efavirenz. BACKGROUND: Regimens containing three nucleoside reverse-transcriptase inhibitors offer an alternative to regimens containing nonnucleoside reverse-transcriptase inhibitors or protease inhibitors for the initial treatment of human immunodeficiency virus type 1 (HIV-1) infection, but data from direct comparisons are limited. METHODS: This randomized, double-blind study involved three antiretroviral regimens for the initial treatment of subjects infected with HIV-1: zidovudine-lamivudine-abacavir, zidovudine-lamivudine plus efavirenz, and zidovudine-lamivudine-abacavir plus efavirenz.
RESULTS: We enrolled a total of 1147 subjects with a mean baseline HIV-1 RNA level of 4.85 log10 (71,434) copies per milliliter and a mean CD4 cell count of 238 per cubic millimeter. A scheduled review by the data and safety monitoring board with the use of prespecified stopping boundaries led to a recommendation to stop the triple-nucleoside group and to present the results for the triple-nucleoside group in comparison with pooled data from the efavirenz groups. After a median follow-up of 32 weeks, 82 of 382 subjects in the triple-nucleoside group (21 percent) and 85 of 765 of those in the combined efavirenz groups (11 percent) had virologic failure; the time to virologic failure was significantly shorter in the triple-nucleoside group (P<0.001). This difference was observed regardless of the pretreatment HIV-1 RNA stratum (at least 100,000 copies per milliliter or below this level; P ≤ 0.001 for both comparisons). Changes in the CD4 cell count and the incidence of grade 3 or grade 4 adverse events did not differ significantly between the groups. CONCLUSIONS: In this trial of the initial treatment of HIV-1 infection, the triple-nucleoside combination of abacavir, zidovudine, and lamivudine was virologically inferior to a regimen containing efavirenz and two or three nucleosides. BACKGROUND: Few randomized trials comparing antiretroviral therapy (ART) regimens have been conducted in resource-limited settings.
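The paired values in the first sentence above (4.85 log10 and 71,434 copies per milliliter) are the same quantity on two scales, a conversion used throughout these abstracts. A quick check:

```python
from math import log10

# Mean baseline viral load from the abstract, in copies/mL.
vl_copies = 71_434

# Forward: copies/mL -> log10 scale (matches the abstract's 4.85 after rounding).
print(round(log10(vl_copies), 2))

# Back: the rounded 4.85 maps to about 70,795 copies/mL; the small gap from
# 71,434 is just the rounding of the log10 value to two decimals.
print(round(10 ** 4.85))
```

This is also why "log(10) reductions" elsewhere in these abstracts compound multiplicatively: a 1.0 log10 drop is a tenfold fall in viral load, and a 2.0 log10 drop is a hundredfold fall.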
METHODS: In the Republic of South Africa, antiretroviral-naive human immunodeficiency virus (HIV)-infected individuals > 14 years old with a CD4 cell count < 200 cells/μL or a prior AIDS diagnosis were randomized to receive efavirenz (EFV) or lopinavir/ritonavir (LPV/r) with either zidovudine (ZDV) plus didanosine (ddI) or stavudine (d4T) plus lamivudine (3TC) in an open-label, 2-by-2 factorial study and followed up for the primary outcome of AIDS or death and prespecified secondary outcomes, including CD4 cell count and viral load changes, treatment discontinuation, and grade 4 events. RESULTS: In total, 1771 persons were randomized and followed up for a median of 24.7 months. AIDS or death occurred in (1) 163 participants assigned EFV and 157 assigned LPV/r (hazard ratio [HR], 1.04 [95% confidence interval {CI}, 0.84-1.30]) and in (2) 170 participants assigned ZDV+ddI and 150 assigned d4T+3TC (HR, 1.15 [95% CI, 0.93-1.44]). HIV RNA levels were lower (P < .001) and CD4 cell counts were greater (P < .01) over follow-up for d4T+3TC versus ZDV+ddI. Rates of potentially life-threatening adverse events and overall treatment discontinuation were similar for d4T+3TC and ZDV+ddI; however, more participants discontinued d4T because of toxicity (12.6%) than other treatments (< 5%). CONCLUSION: EFV and LPV/r are effective components of first-line ART. The poorer viral and immune responses with ZDV+ddI and the greater toxicity-associated discontinuation rate with d4T+3TC suggest that these treatments be used cautiously as initial therapy.
TRIAL REGISTRATION ClinicalTrials.gov identifier: NCT00342355. Purpose The relation between treatment outcome and trough plasma concentrations of efavirenz (EFV), atazanavir (ATV) and lopinavir (LPV) was studied in a pharmacokinetic/pharmacodynamic substudy of the NORTHIV trial, a randomised phase IV efficacy trial comparing antiretroviral-naïve human immunodeficiency virus-1-infected patients treated with (1) EFV + 2 nucleoside reverse transcriptase inhibitors (2NRTI) once daily, (2) ritonavir-boosted ATV + 2NRTI once daily, or (3) ritonavir-boosted LPV + 2NRTI twice daily. The findings were related to the generally cited minimum effective concentration levels for the respective drugs (EFV 1,000 ng/ml, ATV 150 ng/ml, LPV 1,000 ng/ml). The relation between atazanavir-induced hyperbilirubinemia and virological efficacy was also studied. Methods Drug concentrations were sampled at weeks 4 and 48, and optionally at week 12, and analysed by high-performance liquid chromatography with a UV detector. When necessary, trough values were imputed by assuming the reported average half-lives for the respective drugs. Outcomes up to week 48 are reported. Results No relation between plasma concentrations of EFV, ATV or LPV and virological failure, treatment withdrawal due to adverse effects, or antiviral potency (viral load decline from baseline to week 4) was demonstrated. Very few samples were below the suggested minimum efficacy cut-offs, and their predictive value for treatment failure could not be validated. There was a trend toward an increased risk of virological failure in patients on ATV who had an average increase of serum bilirubin from baseline of <25 μmol/l. Conclusions The great majority of treatment-naïve and adherent patients on standard doses of EFV, ritonavir-boosted ATV and ritonavir-boosted LPV have drug concentrations above that considered to deliver the maximum effect for the respective drug.
The results do not support the use of routine therapeutic drug monitoring (TDM) for efficacy optimisation in treatment-naïve patients on these drugs, although TDM may still be of value in some cases of altered pharmacokinetics, adverse events or drug interactions. Serum bilirubin may be a useful marker of adherence to ATV therapy. BACKGROUND Durable suppression of replication of the human immunodeficiency virus (HIV) depends on the use of potent, well-tolerated antiretroviral regimens to which patients can easily adhere. METHODS We conducted an open-label, noninferiority study involving 517 patients with HIV infection who had not previously received antiretroviral therapy and who were randomly assigned to receive either a regimen of tenofovir disoproxil fumarate (DF), emtricitabine, and efavirenz once daily (tenofovir-emtricitabine group) or a regimen of fixed-dose zidovudine and lamivudine twice daily plus efavirenz once daily (zidovudine-lamivudine group). The primary end point was the proportion of patients without baseline resistance to efavirenz in whom the HIV RNA level was less than 400 copies per milliliter at week 48 of the study. RESULTS Through week 48, significantly more patients in the tenofovir-emtricitabine group reached and maintained the primary end point of less than 400 copies of HIV RNA per milliliter than did those in the zidovudine-lamivudine group (84 percent vs. 73 percent, respectively; 95 percent confidence interval for the difference, 4 to 19 percent; P=0.002). This difference excludes the inferiority of the tenofovir DF, emtricitabine, and efavirenz regimen, indicating a significantly greater response with this regimen. Significant differences were also seen in the proportion of patients with HIV RNA levels of less than 50 copies per milliliter (80 percent in the tenofovir-emtricitabine group vs.
70 percent in the zidovudine-lamivudine group; 95 percent confidence interval for the difference, 2 to 17 percent; P=0.02) and in increases in CD4 cell counts (190 vs. 158 cells per cubic millimeter, respectively; 95 percent confidence interval for the difference, 9 to 55; P=0.002). More patients in the zidovudine-lamivudine group than in the tenofovir-emtricitabine group had adverse events resulting in discontinuation of the study drugs (9 percent vs. 4 percent, respectively; P=0.02). In none of the patients did the K65R mutation develop. CONCLUSIONS Through week 48, the combination of tenofovir DF and emtricitabine plus efavirenz fulfilled the criteria for noninferiority to a fixed dose of zidovudine and lamivudine plus efavirenz and proved superior in terms of virologic suppression, CD4 response, and adverse events resulting in discontinuation of the study drugs. (ClinicalTrials.gov number, NCT00112047.) BACKGROUND The integrase inhibitor elvitegravir (EVG) has been co-formulated with the CYP3A4 inhibitor cobicistat (COBI), emtricitabine (FTC), and tenofovir disoproxil fumarate (TDF) in a single tablet given once daily. We compared the efficacy and safety of EVG/COBI/FTC/TDF with the standard of care, co-formulated efavirenz (EFV)/FTC/TDF, as initial treatment for HIV infection. METHODS In this phase 3 trial, treatment-naive patients from outpatient clinics in North America were randomly assigned by computer-generated allocation sequence with a block size of four in a 1:1 ratio to receive EVG/COBI/FTC/TDF or EFV/FTC/TDF, once daily, plus matching placebo. Patients and study staff involved in giving study treatment, assessing outcomes, and collecting and analysing data were masked to treatment allocation. Eligibility criteria included a screening HIV RNA concentration of 5000 copies per mL or more, and susceptibility to efavirenz, emtricitabine, and tenofovir.
The primary endpoint was HIV RNA concentration of fewer than 50 copies per mL at week 48. The study is registered with ClinicalTrials.gov, number NCT01095796. FINDINGS 700 patients were randomly assigned and treated (348 with EVG/COBI/FTC/TDF, 352 with EFV/FTC/TDF). EVG/COBI/FTC/TDF was non-inferior to EFV/FTC/TDF; 305/348 (87.6%) versus 296/352 (84.1%) of patients had HIV RNA concentrations of fewer than 50 copies per mL at week 48 (difference 3.6%, 95% CI -1.6% to 8.8%). Proportions of patients discontinuing drugs for adverse events did not differ substantially (13/348 in the EVG/COBI/FTC/TDF group vs 18/352 in the EFV/FTC/TDF group). Nausea was more common with EVG/COBI/FTC/TDF than with EFV/FTC/TDF (72/348 vs 48/352), and dizziness (23/348 vs 86/352), abnormal dreams (53/348 vs 95/352), insomnia (30/348 vs 49/352), and rash (22/348 vs 43/352) were less common. Serum creatinine concentration increased more by week 48 in the EVG/COBI/FTC/TDF group than in the EFV/FTC/TDF group (median 13 μmol/L, IQR 5 to 20, vs 1 μmol/L, -6 to 8; p<0.001). INTERPRETATION If regulatory approval is given, EVG/COBI/FTC/TDF would be the only single-tablet, once-daily, integrase-inhibitor-based regimen for initial treatment of HIV infection. FUNDING Gilead Sciences. BACKGROUND Antiretroviral therapy is complicated by drug interactions and contraindications. Novel regimens are needed. METHODS This open-label study randomly assigned treatment-naive, human immunodeficiency virus (HIV)-infected subjects to receive tenofovir-emtricitabine with efavirenz (Arm I), with ritonavir-boosted atazanavir (Arm II), or with zidovudine/abacavir (Arm III). Pair-wise comparisons of differences in time-weighted mean change from baseline plasma HIV-RNA to week 48 formed the primary analysis. Treatment arms were noninferior if the upper limit of the 95% confidence interval (CI) was <0.5 log10 copies/mL.
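The noninferiority results reported in these trials can be checked directly from the raw counts. As an illustrative sketch only (the trials' actual analyses may have used stratified or exact intervals rather than this simple Wald approximation), the snippet below recomputes the week-48 risk difference for EVG/COBI/FTC/TDF (305/348) versus EFV/FTC/TDF (296/352) and compares the lower confidence bound against a -12% noninferiority margin:

```python
import math

def wald_diff_ci(x1, n1, x2, n2, z=1.96):
    """Risk difference p1 - p2 with a simple Wald 95% CI (illustrative)."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Week-48 virologic success counts reported above
diff, lo, hi = wald_diff_ci(305, 348, 296, 352)
print(f"difference {diff:+.1%}, 95% CI {lo:+.1%} to {hi:+.1%}")
# Noninferiority holds if the lower CI bound does not cross the -12% margin
print("noninferior" if lo > -0.12 else "noninferiority not shown")
```

This crude interval closely reproduces the published figures (difference 3.6%, 95% CI -1.6% to 8.8%); any small discrepancy in the upper bound reflects the trial's exact interval method.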
Secondary objectives included virologic, immunologic and safety end points. RESULTS The intention-to-treat population comprised 322 patients (Arm I, n=114; Arm II, n=105; and Arm III, n=103). Noninferiority for the primary end point was established. Analysis for superiority showed that Arm III was significantly less potent than Arm I (-0.20 log10 copies/mL; 95% CI, -0.39 to -0.01 log10 copies/mL; P=.038). The proportions of patients on each of Arm I (95%) and Arm II (96%) with <200 copies/mL were not different (P=.75), but the percentage of patients in Arm III with <200 copies/mL (82%) was significantly lower (P=.005). CD4+ cell counts did not differ. Serious adverse events were more frequent in Arm III (n=30) than in Arm I or Arm II (n=15 for each; P=.062). CONCLUSIONS A novel quadruple nucleo(t)side combination demonstrated significantly less suppression of HIV replication, compared with the suppression demonstrated by standard antiretroviral therapy regimens, although it did meet the predetermined formal definition of noninferiority. Secondary analyses indicated statistically inferior virologic and safety performance. Efavirenz and ritonavir-boosted atazanavir arms were equivalent in viral suppression and safety. Background: Induction-maintenance strategies were associated with a low response rate. We compared the virological response with two different induction regimens with trizivir plus efavirenz or lopinavir/ritonavir. Methods: A randomized, multicentre, open-label clinical trial with 209 antiretroviral-naive HIV-infected patients assigned to trizivir plus either efavirenz or lopinavir/ritonavir during 24-36 weeks. Patients reaching undetectable plasma viral loads during induction entered a 48-week maintenance on trizivir alone.
The primary endpoint was the proportion of patients without treatment failure at 72 weeks using an intent-to-treat (ITT) analysis (switching equals failure). Results: Patients were randomly assigned (efavirenz 104; lopinavir/ritonavir 105), and 114 (55%) entered the maintenance phase (efavirenz 54; lopinavir/ritonavir 60). Baseline characteristics were balanced between groups. The response rate at 72 weeks was 31 and 43% (ITT analysis, P=0.076) and 63 and 75% (on-treatment analysis, P=0.172) in the efavirenz and lopinavir/ritonavir arms, respectively. Virological failure occurred in 27 patients: six during induction (efavirenz, three; lopinavir/ritonavir, three; P=1.0) and 21 during maintenance (efavirenz, 14; lopinavir/ritonavir, seven; P=0.057). Thirty-four patients in the efavirenz arm switched treatment because of adverse events compared with 25 in the lopinavir/ritonavir arm (P=0.17). Conclusion: Trizivir plus either efavirenz or lopinavir/ritonavir followed by maintenance with trizivir achieved a low but similar response at 72 weeks, with a high incidence of adverse events leading to drug discontinuation during the induction phase in both arms. The study showed a trend towards an increased virological failure rate in the efavirenz arm during the maintenance phase. BACKGROUND Dolutegravir (S/GSK1349572), a once-daily, unboosted integrase inhibitor, was recently approved in the United States for the treatment of human immunodeficiency virus type 1 (HIV-1) infection in combination with other antiretroviral agents. Dolutegravir, in combination with abacavir-lamivudine, may provide a simplified regimen. METHODS We conducted a randomized, double-blind, phase 3 study involving adult participants who had not received previous therapy for HIV-1 infection and who had an HIV-1 RNA level of 1000 copies per milliliter or more.
Participants were randomly assigned to dolutegravir at a dose of 50 mg plus abacavir-lamivudine once daily (DTG-ABC-3TC group) or combination therapy with efavirenz-tenofovir disoproxil fumarate (DF)-emtricitabine once daily (EFV-TDF-FTC group). The primary end point was the proportion of participants with an HIV-1 RNA level of less than 50 copies per milliliter at week 48. Secondary end points included the time to viral suppression, the change from baseline in CD4+ T-cell count, safety, and viral resistance. RESULTS A total of 833 participants received at least one dose of study drug. At week 48, the proportion of participants with an HIV-1 RNA level of less than 50 copies per milliliter was significantly higher in the DTG-ABC-3TC group than in the EFV-TDF-FTC group (88% vs. 81%, P=0.003), thus meeting the criterion for superiority. The DTG-ABC-3TC group had a shorter median time to viral suppression than did the EFV-TDF-FTC group (28 vs. 84 days, P<0.001), as well as greater increases in CD4+ T-cell count (267 vs. 208 per cubic millimeter, P<0.001). The proportion of participants who discontinued therapy owing to adverse events was lower in the DTG-ABC-3TC group than in the EFV-TDF-FTC group (2% vs. 10%); rash and neuropsychiatric events (including abnormal dreams, anxiety, dizziness, and somnolence) were significantly more common in the EFV-TDF-FTC group, whereas insomnia was reported more frequently in the DTG-ABC-3TC group. No participants in the DTG-ABC-3TC group had detectable antiviral resistance; one tenofovir DF-associated mutation and four efavirenz-associated mutations were detected in participants with virologic failure in the EFV-TDF-FTC group. CONCLUSIONS Dolutegravir plus abacavir-lamivudine had a better safety profile and was more effective through 48 weeks than the regimen with efavirenz-tenofovir DF-emtricitabine. (Funded by ViiV Healthcare; SINGLE ClinicalTrials.gov number, NCT01263015.
) Abstract: We report week 96 results from a phase 3 trial of elvitegravir/cobicistat/emtricitabine/tenofovir disoproxil fumarate (EVG/COBI/FTC/TDF, n=348) vs efavirenz/emtricitabine/tenofovir disoproxil fumarate (EFV/FTC/TDF, n=352). At week 48, EVG/COBI/FTC/TDF was noninferior to EFV/FTC/TDF (88% vs 84%, difference +3.6%, 95% confidence interval: -1.6% to 8.8%). Virologic success (HIV-1 RNA <50 copies/mL) was maintained at week 96 (84% vs 82%, difference +2.7%, 95% CI: -2.9% to 8.3%). Discontinuation due to adverse events was low (5% vs 7%). Median changes in serum creatinine (mg/dL) at week 96 were similar to week 48. These results support the durable efficacy and long-term safety of EVG/COBI/FTC/TDF. Background: Although efavirenz is a universally recommended treatment for naive HIV-infected individuals, neuropsychiatric adverse events are common. Methods: The Study of Efavirenz NeuropSychiatric Events versus Etravirine (SENSE) trial is a double-blind, placebo-controlled study in which 157 treatment-naive individuals with HIV-RNA higher than 5000 copies/ml were randomized to etravirine 400 mg once daily (n=79) or to efavirenz 600 mg once daily (n=78), with two investigator-selected nucleoside reverse transcriptase inhibitors (NRTIs). The primary end point was the percentage of patients with grade 1-4 drug-related treatment-emergent neuropsychiatric adverse events up to week 12. Results: The study population was 81% men and 85% white, with a median age of 36 years, baseline CD4 cell counts of 302 cells/μl and HIV-RNA of 4.8 log10 copies/ml. In the intent-to-treat analysis, 13 of 79 individuals (16.5%) in the etravirine arm and 36 of 78 individuals (46.2%) in the efavirenz arm showed at least one grade 1-4 drug-related treatment-emergent neuropsychiatric adverse event (P<0.001).
The number with at least one grade 2-4 drug-related treatment-emergent neuropsychiatric adverse event was four of 79 individuals (5.1%) in the etravirine arm and 13 of 78 individuals (16.7%) in the efavirenz arm (P=0.019). The change in HIV-RNA to week 12 was -2.9 log10 in both treatment arms. The median rise in CD4 cell counts was 146 cells/μl in the etravirine arm and 121 cells/μl in the efavirenz arm. Conclusions: After 12 weeks, first-line treatment with etravirine 400 mg once daily with two NRTIs was associated with significantly fewer neuropsychiatric adverse events when compared with efavirenz with two NRTIs. The virological and immunological efficacy profile was similar between the two arms. BACKGROUND Dolutegravir has been shown to be non-inferior to an integrase inhibitor and superior to a non-nucleoside reverse transcriptase inhibitor (NNRTI). In FLAMINGO, we compared dolutegravir with darunavir plus ritonavir in individuals naive for antiretroviral therapy. METHODS In this multicentre, open-label, phase 3b, non-inferiority study, HIV-1-infected antiretroviral-therapy-naive adults with an HIV-1 RNA concentration of 1000 copies per mL or more and no resistance at screening were randomly assigned (1:1) to receive either dolutegravir 50 mg once daily or darunavir 800 mg plus ritonavir 100 mg once daily, with investigator-selected tenofovir-emtricitabine or abacavir-lamivudine. Randomisation was stratified by screening HIV-1 RNA (≤100,000 or >100,000 copies per mL) and nucleoside reverse transcriptase inhibitor (NRTI) selection. The primary endpoint was the proportion of patients with HIV-1 RNA concentration lower than 50 copies per mL (Food and Drug Administration [FDA] snapshot algorithm) at week 48, with a 12% non-inferiority margin. This trial is registered with ClinicalTrials.gov, NCT01449929.
FINDINGS Recruitment began on Oct 31, 2011, and was completed on May 24, 2012, in 64 research centres in nine countries worldwide. Of 595 patients screened, 484 patients were included in the analysis (242 in each group). At week 48, 217 (90%) patients receiving dolutegravir and 200 (83%) patients receiving darunavir plus ritonavir had HIV-1 RNA of less than 50 copies per mL (adjusted difference 7.1%, 95% CI 0.9-13.2), showing non-inferiority, and on pre-specified secondary analysis dolutegravir was superior (p=0.025). Confirmed virological failure occurred in two (<1%) patients in each group; we recorded no treatment-emergent resistance in either group. Discontinuation due to adverse events or stopping criteria was less frequent for dolutegravir (four [2%] patients) than for darunavir plus ritonavir (ten [4%] patients) and contributed to the difference in response rates. The most commonly reported (≥10%) adverse events were diarrhoea (dolutegravir 41 [17%] patients vs darunavir plus ritonavir 70 [29%] patients), nausea (39 [16%] vs 43 [18%]), and headache (37 [15%] vs 24 [10%]). Patients receiving dolutegravir had significantly fewer low-density lipoprotein values of grade 2 or higher (11 [2%] vs 36 [7%]; p=0.0001). INTERPRETATION Once-daily dolutegravir was superior to once-daily darunavir plus ritonavir. Once-daily dolutegravir in combination with fixed-dose NRTIs represents an effective new treatment option for HIV-1-infected, treatment-naive patients. FUNDING ViiV Healthcare and Shionogi &. BACKGROUND Use of raltegravir with optimum background therapy is effective and well tolerated in treatment-experienced patients with multidrug-resistant HIV-1 infection. We compared the safety and efficacy of raltegravir with efavirenz as part of combination antiretroviral therapy for treatment-naive patients.
METHODS Patients from 67 study centres on five continents were enrolled between Sept 14, 2006, and June 5, 2008. Eligible patients were infected with HIV-1, had a viral RNA (vRNA) concentration of more than 5000 copies per mL, and had no baseline resistance to efavirenz, tenofovir, or emtricitabine. Patients were randomly allocated by interactive voice response system in a 1:1 ratio (double-blind) to receive 400 mg oral raltegravir twice daily or 600 mg oral efavirenz once daily, in combination with tenofovir and emtricitabine. The primary efficacy endpoint was achievement of a vRNA concentration of less than 50 copies per mL at week 48. The primary analysis was per protocol. The margin of non-inferiority was 12%. This study is registered with ClinicalTrials.gov, number NCT00369941. FINDINGS 566 patients were enrolled and randomly allocated to treatment, of whom 281 received raltegravir, 282 received efavirenz, and three were never treated. At baseline, 297 (53%) patients had more than 100 000 vRNA copies per mL and 267 (47%) had CD4 counts of 200 cells per microL or less. The main analysis (with non-completion counted as failure) showed that 86.1% (n=241 patients) of the raltegravir group and 81.9% (n=230) of the efavirenz group achieved the primary endpoint (difference 4.2%, 95% CI -1.9 to 10.3). The time to achieve such viral suppression was shorter for patients on raltegravir than on efavirenz (log-rank test p<0.0001). Significantly fewer drug-related clinical adverse events occurred in patients on raltegravir (n=124 [44.1%]) than in those on efavirenz (n=217 [77.0%]; difference -32.8%, 95% CI -40.2 to -25.0, p<0.0001). Serious drug-related clinical adverse events occurred in less than 2% of patients in each drug group. INTERPRETATION Raltegravir-based combination treatment had rapid and potent antiretroviral activity, which was non-inferior to that of efavirenz at week 48.
Raltegravir is a well tolerated alternative to efavirenz as part of a combination regimen against HIV-1 in treatment-naive patients. FUNDING Merck. BACKGROUND The non-nucleoside reverse transcriptase inhibitor (NNRTI) rilpivirine (TMC278; Tibotec Pharmaceuticals, County Cork, Ireland) had equivalent sustained efficacy to efavirenz in a phase 2b trial in treatment-naive patients infected with HIV-1, but fewer adverse events. We aimed to assess non-inferiority of rilpivirine to efavirenz in a phase 3 trial with common background nucleoside or nucleotide reverse transcriptase inhibitors (N[t]RTIs). METHODS We undertook a 96-week, phase 3, randomised, double-blind, double-dummy, non-inferiority trial in 98 hospitals or medical centres in 21 countries. We enrolled adults (≥18 years) not previously given antiretroviral therapy and with a screening plasma viral load of 5000 copies per mL or more and viral sensitivity to background N(t)RTIs. We randomly allocated patients (1:1) using a computer-generated interactive web-response system to receive oral rilpivirine 25 mg once daily or efavirenz 600 mg once daily; all patients received an investigator-selected regimen of background N(t)RTIs (tenofovir disoproxil fumarate plus emtricitabine, zidovudine plus lamivudine, or abacavir plus lamivudine). The primary outcome was non-inferiority (12% margin on logistic regression analysis) at 48 weeks in terms of confirmed response (viral load <50 copies per mL, defined by the intent-to-treat time to loss of virologic response [TLOVR] algorithm) in all patients who received at least one dose of study drug. This study is registered with ClinicalTrials.gov, number NCT00543725. FINDINGS From May 22, 2008, we screened 947 patients and enrolled 340 to each group.
86% of patients (291 of 340) who received at least one dose of rilpivirine responded, compared with 82% of patients (276 of 338) who received at least one dose of efavirenz (difference 3.5% [95% CI -1.7 to 8.8]; p(non-inferiority)<0.0001). Increases in CD4 cell counts were much the same between groups. 7% of patients (24 of 340) receiving rilpivirine had virological failure compared with 5% of patients (18 of 338) receiving efavirenz. 4% of patients (15) in the rilpivirine group and 7% (25) in the efavirenz group discontinued treatment due to adverse events. Grade 2-4 treatment-related adverse events were less common with rilpivirine (16% [54 patients]) than with efavirenz (31% [104]; p<0.0001), as were rash and dizziness (p<0.0001 for both), and increases in lipid levels were significantly lower with rilpivirine than with efavirenz (p<0.0001). INTERPRETATION Despite a slightly increased incidence of virological failures, a favourable safety profile and non-inferior efficacy compared with efavirenz mean that rilpivirine could be a new treatment option for treatment-naive patients infected with HIV-1. FUNDING Tibotec. BACKGROUND The 2NN Study was a randomised comparison of the non-nucleoside reverse-transcriptase inhibitors (NNRTIs) nevirapine and efavirenz. METHODS In this multicentre, open-label, randomised trial, 1216 antiretroviral-therapy-naive patients were assigned nevirapine 400 mg once daily, nevirapine 200 mg twice daily, efavirenz 600 mg once daily, or nevirapine (400 mg) and efavirenz (800 mg) once daily, plus stavudine and lamivudine, for 48 weeks.
The primary endpoint was the proportion of patients with treatment failure (less than 1 log10 decline in plasma HIV-1 RNA in the first 12 weeks or two consecutive measurements of more than 50 copies per mL from week 24 onwards, disease progression [new Centers for Disease Control and Prevention grade C event or death], or change of allocated treatment). Analyses were by intention to treat. FINDINGS Treatment failure occurred in 96 (43.6%) of 220 patients assigned nevirapine once daily, 169 (43.7%) of 387 assigned nevirapine twice daily, 151 (37.8%) of 400 assigned efavirenz, and 111 (53.1%) of 209 assigned nevirapine plus efavirenz. The difference between nevirapine twice daily and efavirenz was 5.9% (95% CI -0.9 to 12.8). There were no significant differences among the study groups in the proportions with plasma HIV-1 RNA concentrations below 50 copies per mL at week 48 (p=0.193) or the increases in CD4-positive cells (p=0.800). Nevirapine plus efavirenz was associated with the highest frequency of clinical adverse events, and nevirapine once daily with significantly more hepatobiliary laboratory toxicities than efavirenz. Of 25 observed deaths, two were attributed to nevirapine. INTERPRETATION Antiretroviral therapy with nevirapine or efavirenz showed similar efficacy, so triple-drug regimens with either NNRTI are valid for first-line treatment. There are, however, differences in safety profiles. Combination of nevirapine and efavirenz did not improve efficacy but caused more adverse events. Background: The Study of Etravirine Neuropsychiatric Symptoms versus Efavirenz (SENSE) trial compared etravirine with efavirenz in treatment-naive patients. The primary endpoint was neuropsychiatric adverse events up to week 12; HIV RNA suppression at week 48 was a secondary endpoint.
Methods: Patients with HIV RNA more than 5000 copies/ml were randomized to etravirine 400 mg once daily (n=79) or efavirenz (n=78), plus two nucleoside analogues. HIV RNA less than 50 copies/ml at week 48 was analysed using the time to loss of virological response (TLOVR) algorithm. Drug resistance at treatment failure and safety endpoints were also evaluated. Results: At baseline, the median CD4 cell count was 302 cells/μl and HIV RNA was 4.8 log10 copies/ml. In the intent-to-treat TLOVR analysis at week 48, 60 of 79 (76%) patients on etravirine versus 58 of 78 (74%) on efavirenz had HIV RNA less than 50 copies/ml. In the on-treatment analysis, 60 of 65 (92%) taking etravirine had HIV RNA less than 50 copies/ml versus 58 of 65 (89%) for efavirenz: etravirine showed noninferior efficacy versus efavirenz in both analyses (P<0.05). Four patients had virological failure in the etravirine arm: none developed resistance to nucleoside analogues or nonnucleosides. Seven patients had virological failure in the efavirenz arm: three developed treatment-emergent resistance to nucleoside analogues and/or nonnucleosides. At the week 48 visit, the percentage with ongoing neuropsychiatric adverse events was 6.3% for etravirine and 21.5% for efavirenz (P=0.011). Conclusion: First-line treatment with etravirine 400 mg once daily and two nucleoside reverse transcriptase inhibitors (NRTIs) led to similar rates of HIV RNA suppression, compared with efavirenz and two NRTIs. None of the patients with virological failure in the etravirine arm developed resistance to nonnucleosides. BACKGROUND The introduction of combination antiretroviral therapy and protease inhibitors has led to reports of falling mortality rates among people infected with HIV-1.
We examined the change in these mortality rates of HIV-1-infected patients across Europe during 1994-98, and assessed the extent to which changes can be explained by the use of new therapeutic regimens. METHODS We analysed data from EuroSIDA, which is a prospective, observational, European, multicentre cohort of 4270 HIV-1-infected patients. We compared death rates in each 6-month period from September, 1994, to March, 1998. FINDINGS By March, 1998, 1215 patients had died. The mortality rate from March to September, 1995, was 23.3 deaths per 100 person-years of follow-up (95% CI 20.6-26.0), and fell to 4.1 per 100 person-years of follow-up (2.3-5.9) between September, 1997, and March, 1998. From March to September, 1997, the death rate was 65.4 per 100 person-years of follow-up for those on no treatment, 7.5 per 100 person-years of follow-up for patients on dual therapy, and 3.4 per 100 person-years of follow-up for patients on triple-combination therapy. Compared with patients who were followed up from September, 1994, to March, 1995, patients seen between September, 1997, and March, 1998, had a relative hazard of death of 0.16 (0.08-0.32), which rose to 0.90 (0.50-1.64) after adjustment for treatment. INTERPRETATION Death rates across Europe among patients infected with HIV-1 have been falling since September, 1995, and at the beginning of 1998 were less than a fifth of their previous level. A large proportion of the reduction in mortality could be explained by new treatments or combinations of treatments. Objective: TMC278 is a next-generation nonnucleoside reverse transcriptase inhibitor highly active against wild-type and nonnucleoside reverse transcriptase inhibitor-resistant HIV-1 in vitro. The week 96 analysis of TMC278-C204, a large dose-ranging study of TMC278 in treatment-naive HIV-1-infected patients, is presented. Design: Phase IIb randomized trial.
Methods: Three hundred sixty-eight patients were randomized and treated with one of three blinded once-daily TMC278 doses (25, 75 or 150 mg) or an open-label active control, efavirenz 600 mg once daily, all with two nucleoside reverse transcriptase inhibitors. The primary analysis was at week 48. Results: No TMC278 dose-response relationship for efficacy and safety was observed. TMC278 demonstrated potent antiviral efficacy comparable with efavirenz over 48 weeks that was sustained to week 96 (76.9-80.0% and 71.4-76.3% of TMC278-treated patients with confirmed viral load <50 copies/ml, respectively; time-to-loss-of-virological-response algorithm). Median increases from baseline in CD4 cell count with TMC278 at week 96 (138.0-149.0 cells/μl) were higher than at week 48 (108.0-123.0 cells/μl). All TMC278 doses were well tolerated. The incidences of the most commonly reported grade 2-4 adverse events at least possibly related to study medication, including nausea, dizziness, abnormal dreams/nightmare, dyspepsia, asthenia, rash, somnolence and vertigo, were low, and lower with TMC278 than with efavirenz. Incidences of serious adverse events, grade 3 or 4 adverse events and discontinuations due to adverse events were similar among groups. Conclusion: All TMC278 doses demonstrated potent and sustained efficacy comparable with efavirenz in treatment-naive patients over 96 weeks. TMC278 was well tolerated, with lower incidences of neurological and psychiatric adverse events and rash, and lower lipid elevations, than with efavirenz. TMC278 25 mg once daily was selected for further clinical development. A prospective, randomized pilot trial was conducted in naive patients comparing three different combinations: zidovudine+lamivudine+lopinavir/ritonavir (arm A) versus tenofovir+lamivudine+efavirenz (arm B) versus tenofovir+didanosine+efavirenz (arm C).
HIV-RNA slope (days 1, 3, 7, 14 and 28) was slower in arm C with respect to arm B (P<0.0001). Seven out of eight patients (87.5%) reached undetectable HIV-RNA by week 28 in arm A, 10/10 (100%) in arm B and 6/10 (60%) in arm C. Among arm C patients who failed at week 4, one HIV isolate showed 67N and 219Q substitutions and another showed 210F and 215D substitutions in the HIV reverse transcriptase gene at baseline. Non-nucleoside reverse transcriptase inhibitor resistance-related mutations appeared first, followed by 65R mutations in all cases. Efavirenz AUC(0-24) values were lower in arm C with respect to arm B, especially in patients who failed early. A high virological failure rate after tenofovir+didanosine+efavirenz correlated with a slower HIV-RNA decrease and a peculiar accumulation of resistance mutations. A constellation of factors could be correlated with early failure events in patients receiving this combination, such as resistance mutations or polymorphisms present at baseline, low CD4+ T-cell count or advanced disease, and unexpectedly low efavirenz plasma levels. Abstract Purpose: To compare long-term virologic, immunologic, and clinical outcomes in antiretroviral-naïve persons starting efavirenz (EFV)- versus nevirapine (NVP)-based regimens. Method: The FIRST study randomized patients into three strategy arms: PI+NRTI, NNRTI+NRTI, and PI+NNRTI+NRTI. NNRTI was determined by optional randomization (NVP or EFV) or by choice. For this randomized substudy, the primary endpoint was HIV RNA >50 copies/mL after 8 months or death. Genotypic resistance testing was done at virologic failure (VF; HIV RNA >1000 copies/mL at or after 4 months). Results: 228 persons (111 randomized to EFV, 117 to NVP) were followed for a median of 5 years. Rates per 100 person-years for the primary endpoint were 41.2 (EFV) and 42.8 (NVP; p=.59).
The percent of persons with HIV RNA <50 copies/mL was similar throughout follow-up (p=.24), as were average increases in CD4+ cells (p=.30). 423 persons declining the substudy chose EFV; 264 chose NVP. There were 915 persons in the combined cohort (randomized and choice). In the combined cohort, the risk of VF, and of VF with any NNRTI or NRTI resistance or resistance of any class, was significantly less for EFV compared to NVP. Conclusions: EFV-based regimens as initial therapy resulted in a lower risk of VF and VF with resistance than did NVP-based regimens, although immunologic and clinical outcomes were similar. (i) To compare early decrease of HIV plasma viral load (pVL) after two standard combinations of highly active antiretroviral therapy (HAART). (ii) To evaluate variations of proviral HIV-DNA load under conditions of sustained pVL undetectability. Two different substudies of a multicentre prospective randomized controlled trial which compared two first-line HAART regimens (i.e., zidovudine+lamivudine+lopinavir/ritonavir versus tenofovir+lamivudine+efavirenz). Only patients enrolled at the coordinating centre (University of Brescia) were included in the two substudies. In the first substudy, we calculated pVL decrease with respect to baseline at each of the following time-points: days 1, 3, 7, 14 and 28. Decreases of the pVL were compared between the two treatment groups. In the second substudy, we analyzed variation of proviral HIV-DNA load in CD4+ T-cells from baseline to week 52, only in patients who maintained the same treatment regimen and had sustained undetectable pVL. In both studies, linear regression analysis was used to investigate what factors could influence variations of pVL and of proviral HIV-DNA load. (i) 64 patients were studied. A significant decrease of pVL was found from day 3 on, without statistically significant differences between the two study groups.
However, after adjusting for possible confounders, tenofovir+lamivudine+efavirenz was associated with greater pVL decreases. (ii) 45 patients were studied. Mean proviral HIV-DNA load decreased from 1,610 (95% CI: 879-2,341) to 896 (95% CI: 499-1,293) copies/10(6) cells (P=0.05). Linear regression analysis showed that the decrease of proviral DNA load during follow-up was independently and inversely correlated with age. Further studies are needed to compare pVL decay between antiretroviral regimens and to assess whether proviral HIV-DNA load is a surrogate marker of treatment effectiveness. Objective: To compare alternative class-sparing antiretroviral regimens in treatment-naive subjects. Design: Open-label, multicenter, randomized trial of up to 3 consecutive treatment regimens over 96 weeks. Methods: Two hundred ninety-one subjects received abacavir (ABC) and lamivudine plus either efavirenz (nonnucleoside reverse transcriptase inhibitor [NNRTI]), ritonavir-boosted amprenavir (protease inhibitor [PI]), or stavudine (nucleoside reverse transcriptase inhibitor [NRTI]) by random assignment. The primary end points were the percentages of subjects with plasma HIV-1 RNA levels <400 copies/mL and time to treatment failure over 96 weeks. Results: Ninety percent of subjects completed 96 weeks of follow-up, and 79% remained on study treatment. At week 96, there were no differences between arms in the percentages of subjects with plasma HIV-1 RNA levels <400 and <50 copies/mL, mean changes in plasma HIV-1 RNA levels, time to treatment failure, time to first or second virologic failure, or CD4+ cell counts. The NNRTI arm had a greater percentage of subjects with RNA levels ≤50 copies/mL at weeks 24 and 48 and a greater overall duration of plasma HIV-1 RNA levels <400 copies/mL.
Three subjects in the NNRTI arm had treatment failure on their first regimen and switched therapy, compared with 16 in the NRTI arm and 13 in the PI arm. Twenty-one subjects had hypersensitivity reactions attributed to ABC (7.3%). Fewer drugs were used by subjects in the NNRTI arm, and fewer subjects in the NNRTI arm used 3 drug classes. Conclusions: All treatment regimens demonstrated excellent 96-week results. Secondary analyses favored the NNRTI regimen over the PI and NRTI regimens. Antiretroviral-naive HIV-1-infected volunteers received zidovudine/lamivudine plus either lopinavir/ritonavir (n=104) or efavirenz (n=51). Lopinavir/ritonavir-treated subjects demonstrating 3 consecutive monthly HIV-1 RNA levels <50 copies/mL started lopinavir/ritonavir monotherapy. In previous-failure=failure analysis, 48% (lopinavir/ritonavir) and 61% (efavirenz) maintained HIV-1 RNA at <50 copies/mL through week 96 (P=.17; 95% confidence interval [CI] for the difference, -29% to 4%); in noncompletion=failure analysis, 60% (lopinavir/ritonavir) and 63% (efavirenz) maintained HIV-1 RNA at <50 copies/mL at week 96 (P=.73; 95% CI for the difference, -19% to 13%). Significant sparing of peripheral lipoatrophy was noted in the lopinavir/ritonavir simplification strategy. This study has provided important information for future studies using treatment simplified to lopinavir/ritonavir monotherapy. The introduction of combination antiretroviral therapy for HIV infection revolutionized treatment of AIDS and HIV disease. Such treatment, which usually includes two nucleoside analogue inhibitors of HIV reverse transcriptase and at least one HIV protease inhibitor or non-nucleoside inhibitor of HIV reverse transcriptase, is called highly active antiretroviral therapy (HAART).
Early cohort and registry-based studies showed a lower incidence of AIDS and decreased rates of AIDS-related mortality after the introduction of HAART in late 1995 in the United States (1-3), France (4), Australia (5), Germany (6), and Switzerland (7). However, ecologic studies (those that measured use of HAART and mortality in groups rather than in individual patients) may be subject to confounding by calendar-period changes in other unmeasured variables. In addition, many studies that directly measured the effects of HAART on survival of patients with AIDS did not provide specific evidence of such effects in the patients with the most advanced cases (8-10), with the exception of small cohorts of patients with cytomegalovirus (CMV) retinitis (11) and progressive multifocal leukoencephalopathy (12). We therefore studied the effect of HAART in a large multicenter trial of blood transfusion in patients with advanced HIV disease who were also anemic, an indicator of poor prognosis (13, 14). Because the study spanned the period before and after the introduction of HAART, we could directly assess the effect of HAART status on subsequent death or opportunistic events. In addition, because our person-year analysis was controlled for calendar time, our estimates of the magnitude of the effect of HAART on mortality and morbidity are less likely to be confounded by changes in patient mix or medical practice compared with previously published studies. Methods Patients and Study Design The Viral Activation Transfusion Study (VATS) was a multicenter, randomized, double-blind clinical trial of leukoreduced versus nonleukoreduced red blood cell transfusion in HIV-infected patients who required a first transfusion for anemia. Details of the study design have been published elsewhere (15, 16).
In brief, we enrolled patients who were HIV seropositive, were CMV seropositive or had a history of documented CMV disease, had an expected survival of at least 1 month, and required red blood cell transfusion for anemia. Data on clinical end points and prescribed medications were obtained by interview and medical record review at baseline and every 3 months thereafter. Ophthalmologic examinations to detect CMV retinitis were done at baseline and every 6 months thereafter. In patients who did not return for follow-up visits, vital status was ascertained by review of the medical record, a search for a death certificate, and tracing by using public databases. Of 528 patients, 58 were excluded before death or end of the study, including 29 for whom no follow-up information on opportunistic events was available and 3 for whom follow-up for death but not for opportunistic events was complete. The CD4+ lymphocyte count and plasma HIV RNA level were measured by using frozen blood specimens at the central study laboratory, as described elsewhere (15). The study protocol was approved by the institutional review boards of the 11 participating medical centers, and informed consent was obtained from all participants. Definitions of HAART and End Points The VATS did not dictate the choice of antiretroviral therapy for enrolled patients, except that prescription of new antiretroviral drugs was discouraged during the 2 weeks after the first two transfusion episodes. Highly active antiretroviral therapy was defined as prescription of at least three antiretroviral medications with activity against HIV, at least one of which was an HIV protease inhibitor or a non-nucleoside HIV reverse transcriptase inhibitor. Medication history collected at each quarterly visit included the names and start and stop dates of any HIV antiretroviral medications taken since the previous visit or in the 30 days before entry into the study.
Patients taking no HIV antiretroviral medications and those taking HIV antiretroviral medications that did not fulfill our definition of HAART were classified as "before HAART" until the day that they began HAART. Once a patient initiated HAART, all of his or her subsequent follow-up time remained designated as "after HAART" even if they discontinued HAART, to approximate an intention-to-treat analysis. We made this conservative decision to avoid overestimating the effect of HAART, which would occur if patients who stopped HAART because of intolerance subsequently contributed outcomes to the "before HAART" person-years. Adherence to prescribed medications was not measured. End points of the current study were death, opportunistic events, and recurrent transfusions. Opportunistic events were defined a priori and confirmed by expert reviewers; they included clinical diagnoses corresponding to the Centers for Disease Control and Prevention's AIDS-defining conditions (17), with some modifications. Presumptive diagnoses of central nervous system toxoplasmosis were allowed according to the Centers for Disease Control and Prevention's definition. For CMV retinitis, progression of disease requiring initiation of or a change in anti-CMV medication counted as a clinical event. For analysis, end points were grouped as follows: CMV opportunistic events; non-CMV opportunistic events; all opportunistic events; death; and any end point, including death and all opportunistic events. We analyzed recurrent red blood cell transfusion (that is, occurring after the enrollment transfusion event) as a separate end point. Statistical Analysis We used a person-years approach to analyze event rates. The denominator for these rates was not the individual patient but was person-years of observation time, defined as study follow-up time for each patient classified as before or after initiation of HAART and summed over all patients.
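The before/after HAART classification described above amounts to splitting each patient's follow-up at the date HAART began, with everything after initiation kept in the "after HAART" stratum even if treatment later stops. A minimal sketch of that bookkeeping (the function name and the example dates are hypothetical illustrations, not VATS data):

```python
from datetime import date

def split_person_years(entry, exit, haart_start=None):
    """Split one patient's follow-up into (pre_HAART, post_HAART) person-years.

    Time after HAART initiation stays 'post' even if HAART is later
    discontinued, mirroring the intention-to-treat-style rule in the text.
    """
    total = (exit - entry).days / 365.25
    if haart_start is None or haart_start >= exit:
        return total, 0.0                       # never started HAART on study
    if haart_start <= entry:
        return 0.0, total                       # already on HAART at study entry
    pre = (haart_start - entry).days / 365.25   # started HAART during follow-up
    return pre, total - pre

# A patient followed 1996-1998 who starts HAART in 1997 contributes
# roughly one person-year to each stratum, serving as their own control.
pre, post = split_person_years(date(1996, 1, 1), date(1998, 1, 1),
                               haart_start=date(1997, 1, 1))
```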
For example, patients who started HAART during the study contributed observation time to both the post-HAART and pre-HAART categories, in effect serving as their own controls. Patients who began HAART before study entry and those who never began HAART during the study period contributed only post-HAART or pre-HAART observation time, respectively. Observation time for a patient extended from the date of study entry to death or the end of follow-up. Because some patients had no follow-up information for opportunistic events and a few had less follow-up for opportunistic events than for mortality, the total observation time for opportunistic events was shorter than that for overall survival. Events were assigned to the time period during which they occurred, and incidence was reported as the number of events per person-year of observation time. A single patient could contribute multiple events (except for the mortality analysis) to the same or to different time periods. The association between HAART use and incidence rates was expressed as a rate ratio. Confidence intervals and P values were calculated from a Poisson regression model using generalized estimating equations with an exchangeable correlation structure to adjust for correlated observations across time within the same patient (18, 19). All P values are two-sided. Because the use of HAART increased dramatically over the 4 years of the study, the comparison of post-HAART and pre-HAART data is confounded with calendar time. In addition, the mixture of HIV risk groups, prevalence of prophylaxis against opportunistic infections, diagnostic testing accuracy, and physician experience may have changed over time. To evaluate and control for these and other calendar-time effects, each patient's observation time was further categorized into four 1-year calendar time periods, starting in July 1995.
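The rate-ratio arithmetic behind this person-years approach can be sketched in a few lines. The counts below are hypothetical, not the VATS results, and the confidence interval uses the simple large-sample log-rate-ratio formula for independent Poisson counts; the paper itself used GEE precisely because repeated events within a patient are correlated, which widens these intervals:

```python
from math import exp, log, sqrt

def incidence_rate(events, person_years):
    """Crude incidence rate: events per person-year of observation."""
    return events / person_years

# Hypothetical stratum totals: events and summed person-years of
# observation time before and after HAART initiation.
pre_events, pre_py = 120, 250.0
post_events, post_py = 45, 400.0

rate_pre = incidence_rate(pre_events, pre_py)
rate_post = incidence_rate(post_events, post_py)
rate_ratio = rate_post / rate_pre

# Naive 95% CI on the log scale, assuming independent Poisson counts
# (an assumption the GEE analysis in the text deliberately relaxes).
se = sqrt(1 / post_events + 1 / pre_events)
ci = (exp(log(rate_ratio) - 1.96 * se), exp(log(rate_ratio) + 1.96 * se))
```

A rate ratio well below 1 with an upper confidence limit below 1 would indicate a lower event rate during post-HAART observation time.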
Similarly, a patient's time since entering the study was divided into five study time periods (0 to 6, 7 to 12, 13 to 18, 19 to 24, or >24 months from randomization). The Poisson regression generalized estimating equations method was then used to calculate the rate ratios associated with CD4+ T-cell count, plasma HIV RNA level, calendar time, and time on study, and to adjust the comparisons of post-HAART and pre-HAART data for these covariates, both in pairs and combined with patient characteristics. Results Participants The VATS enrolled patients with anemia and HIV infection who received a first red blood cell transfusion between August 1995 and July 1998. Patients were followed until death or their last scheduled quarterly visit before 30 June 1999. Of 531 patients enrolled in VATS, we excluded 3 patients for whom no data on medication were available; thus, the sample for analysis in this study was 528 patients. Table 1 shows the baseline characteristics of the study sample. Most patients were 30 to 49 years of age and male, and equal proportions of patients were of white and nonwhite ethnicity. Half of the patients were men who reported sex with other men as their only HIV risk factor; 19% each were injection drug users or sexually active heterosexuals without other risk factors for HIV; and 12% had multiple or other risk factors. At baseline, the median CD4+ lymphocyte count was 0.015 × 10⁹ cells/L, and 69% of patients had a CD4+ lymphocyte count less than 0.050 × 10⁹ cells/L. The median HIV RNA level was 4.8 log10 copies/mL, and only 8% of patients had plasma HIV RNA levels less than 200 copies/mL. Table 1. Baseline Characteristics of 528 Patients in the Viral Activation Transfusion Study. At baseline, most patients had previously taken antiretroviral medication; more than half had taken such medication for 12 months or longer.
The proportion of patients taking prophylaxis or treatment for Pneumocystis carinii pneumonia and Mycobacterium avium complex infection decreased from 95% and 49%, respectively, in 1995-1996 to 82% and 44% in BACKGROUND Although efavirenz and lopinavir/ritonavir (r) are both recommended antiretroviral agents in antiretroviral-naïve HIV-infected patients, there are few randomized comparisons of their efficacy and tolerability. METHODS A multicenter, randomized study was performed including 126 antiretroviral-naïve patients, randomly assigned to efavirenz+Kivexa (n=63) or lopinavir/r+Kivexa (n=63). Efficacy endpoints were the percentage of patients with HIV-RNA ≤50 copies/mL at week 48 and CD4 recovery. Safety was assessed by comparing toxicity and discontinuations. Statistical analyses were performed on an intention-to-treat (ITT) basis (Missing=Failure). RESULTS At week 48, 56.7% of patients in the efavirenz group and 63.2% in the lopinavir/r group showed HIV-1 RNA <50 copies/mL (P=0.770) (intention-to-treat analysis; Missing=Failure). Only 1 (1.53%) patient from each group experienced virological failure. CD4 values increased in both groups (298 cells in the efavirenz group, P=0.001; 249 cells in the lopinavir/r group, P=0.002; P=0.126 between groups). HDL-cholesterol only increased in the efavirenz group (from 39±12 mg/dL to 49±11; P=0.001). Discontinuations were more frequent in the lopinavir/r group (36.5% versus 28.5%; P=0.193), but more patients on efavirenz interrupted due to toxicity (11.1% versus 6.3%); most of these interruptions were attributed to hypersensitivity reaction. CONCLUSIONS Similar virological efficacy was observed for efavirenz and lopinavir/r when administered with Kivexa in antiretroviral-naïve patients, while immunological improvement was slightly superior for efavirenz.
The higher rate of discontinuation due to toxicity in the efavirenz group was related to a higher incidence of hypersensitivity reaction. Nowadays, the use of the new formulation of lopinavir/r and the HLA-B*5701 genotype test before starting abacavir should improve the safety profiles of these regimens. Abstract Background: The objective of this study was to compare the efficacy of ritonavir-boosted atazanavir versus ritonavir-boosted lopinavir or efavirenz, all in combination with 2 nucleoside analogue reverse transcriptase inhibitors (NRTIs), over 144 weeks in antiretroviral-naïve HIV-1-infected individuals. Methods: A prospective open-label randomized controlled trial was conducted at 29 sites in Sweden and Norway between April 2004 and December 2009. Patients were randomized to receive either efavirenz 600 mg once daily (EFV), or atazanavir 300 mg and ritonavir 100 mg once daily (AZV/r), or lopinavir 400 mg and ritonavir 100 mg twice daily (LPV/r). The primary endpoints were the proportion of patients with HIV-1 RNA <50 copies/ml at 48 and 144 weeks. Results: Of 245 patients enrolled, 243 were randomized and 239 received the allocated intervention: 77 EFV, 81 AZV/r, and 81 LPV/r. Median (interquartile range) CD4 cell counts at baseline were 150 (80-200), 170 (80-220), and 150 (90-216) per microlitre, respectively. At week 48 the proportion (95% confidence interval (CI)) of patients achieving HIV-1 RNA <50 copies/ml was 86 (78-94)% in the EFV arm, 78 (69-87)% in the AZV/r arm, and 69 (59-78)% in the LPV/r arm in the intention-to-treat analysis. There was a significant difference between the EFV and LPV/r arms (p=0.014). At week 144, the proportion (95% CI) of patients achieving HIV-1 RNA <50 copies/ml was 61 (50-72)%, 58 (47-69)%, and 51 (41-63)%, respectively (p=0.8). Patients with CD4 cell counts of ≤200/μl or HIV-1 RNA >100,000 copies/ml at baseline had similar response rates in all arms.
Conclusion: EFV was superior to LPV/r at week 48, but there were no significant differences between the 3 arms in the long-term (144 weeks) follow-up. Context Postmarketing reports have suggested that efavirenz increases risk for suicide. Contribution In an analysis of data from 4 large, randomized trials in which patients with HIV were randomly assigned to either efavirenz-containing or efavirenz-free regimens for initial therapy, efavirenz was associated with a doubling of risk for suicidality (a composite of suicide, suicide attempt, and suicidal ideation). Caution The clinical trials were not specifically designed to investigate suicidality. Implication An increased risk for suicidality should be considered when choosing efavirenz as part of an initial antiretroviral regimen. The Editors Efavirenz is a preferred nonnucleoside reverse transcriptase inhibitor for treatment of HIV (1-4). Although efavirenz is generally safe and effective, it is associated with central nervous system side effects (5-8); prescribing information contains warnings of rare but serious psychiatric experiences, including suicide, but also notes that a causal relationship cannot be determined from postmarketing reports (5). Likewise, published cases and case series report suicidal thoughts or behavior with efavirenz (9-17). A literature review stated that clear evidence of an association between efavirenz and suicide was not available and thus psychiatric history should not exclude patients from efavirenz treatment (18). Given the widespread use of efavirenz and uncertainty about its relationship to suicide, suicide attempt, or suicidal ideation, we report an AIDS Clinical Trials Group cross-protocol analysis of 4 studies in which participants were randomly assigned to an initial efavirenz-containing or efavirenz-free antiretroviral regimen.
Our primary goal was to compare the hazard of suicidality between participants assigned to an efavirenz-containing versus efavirenz-free antiretroviral regimen for initial treatment of HIV-1, a potential safety issue not reported in the original studies. Methods Study Design and Participants Individual-level data from antiretroviral-naive participants with HIV-1 in AIDS Clinical Trials Group studies conducted from 2001 to 2010 that involved random assignment to an efavirenz-containing or efavirenz-free regimen were included in this prespecified retrospective cross-study analysis. Four studies met these criteria: A5095 (ClinicalTrials.gov: NCT00013520) (19, 20), A5142 (NCT00050895) (21), A5175 (NCT00084136) (22), and A5202 (NCT00118898) (23). Components of the antiretroviral regimen were randomly assigned, except for the nucleoside analogue choice in A5142. The studies varied by antiretroviral regimen and slightly by duration and eligibility criteria (Table 1); each study excluded participants with substantially abnormal baseline laboratory values. Histories of suicidal ideation or attempt were not exclusion criteria. Studies A5095 and A5202 enrolled participants in the United States and Puerto Rico; A5142 enrolled participants in the United States and South Africa; and A5175 enrolled participants from 9 countries in North and South America, Africa, and Asia. Table 1. Summary of Included Studies * Study protocols required reporting of signs, symptoms, or diagnoses at each visit, which were recorded with both open-text and data-entry codes. In A5175, diagnosis reporting instructions included specific codes for suicidality; the other studies used general psychiatric event codes (for example, "psychiatric disorder, specify") plus open-text description.
Each study required reporting of severe and life-threatening graded signs or symptoms per the Division of AIDS grading table (24), as well as any sign or symptom, regardless of grade, that led to a change in study treatment; diagnoses were not graded. Further, study A5142 required reporting of all moderate signs or symptoms, and A5095 and A5202 required reporting of all moderate central nervous system symptoms. Site institutional review boards approved each study; participants provided written informed consent. Randomization Each study used permuted-block randomization; stratification factors and treatment groups are listed in Table 1. Efavirenz was formulated as one 600-mg pill given once daily, with three 200-mg pills given initially in A5095. Efavirenz assignment was open-label in A5142, A5175, and A5202 and was blinded and placebo-controlled in A5095 before a data safety monitoring board (DSMB) recommendation to unblind efavirenz. The DSMB released recommendations mid-study about inferior efficacy in the efavirenz-free group of A5095 (20 February 2003) and A5175 (23 May 2008), after which participants in the efavirenz-free groups were given the option to switch treatment (19, 22). Outcomes The primary outcome of this cross-study analysis was suicidality, defined as suicidal ideation or attempted or completed suicide, identified from signs, symptoms, diagnoses, adverse events, and death data via the Medical Dictionary for Regulatory Activities, version 15.0. Prespecified Medical Dictionary for Regulatory Activities preferred terms were completed suicide, suicide attempt, intentional overdose, multiple drug overdose intentional, poisoning deliberate, suicidal ideation, suicidal behaviour, and depression suicidal. Attempted or completed suicide was a secondary outcome.
Clinical investigators, blinded to treatment and previous adverse events, independently reviewed death data categorized as suicide, substance abuse, homicide, accident, unknown cause, or other cause (for example, infection, cancer, or organ failure); a secondary outcome included suicidality or fatal injury attributed to substance abuse, homicide, or accident. Covariates Each study protocol required reporting of prescription medication ongoing within 30 days before entry (denoted "recent prestudy"); prestudy psychoactive and antidepressant medications were identified from a medication list on the National Institute of Mental Health Web site (25). Psychiatric history was defined as any event in the Medical Dictionary for Regulatory Activities system organ class "psychiatric disorders", and depression-related events were classified according to review of psychiatric events data by a psychologist. Prestudy psychiatric measures included psychiatric event history, recent psychoactive medication, depression-related event history, and recent antidepressant medication; presence of event history or prestudy medication was combined into 1 covariate. Additional a priori baseline covariates included geographic region, sex, race or ethnic group, age, pretreatment CD4 count, history of AIDS-defining event, and history of injection drug use (IDU); pretreatment HIV-1 RNA levels, body weight, and body mass index (BMI) at study entry were evaluated post hoc (Appendix Table 1). Analysis of race or ethnic group was limited to white, black, and Hispanic participants from the United States because of potential social and ethnic differences among countries and low frequencies in other groups, and was self-reported and classified according to National Institutes of Health categories. Covariate misclassification was possible; for example, history of psychiatric events or IDU could have been undisclosed or underreported. Appendix Table 1.
Baseline Characteristics * Data Synthesis and Statistical Analysis The primary analysis approach was intention-to-treat (ITT). Participant-level data were analyzed according to randomized treatment allocation, with follow-up from randomization to last on-study contact or death; all follow-up in A5095 and A5175 was censored after a DSMB recommendation related to the efavirenz comparison (denoted "ITT DSMB"). In a sensitivity analysis, follow-up included time from randomization to last on-study contact or death, regardless of DSMB recommendations (denoted "ITT"); deaths were summarized using the ITT approach. As-treated analyses excluded participants who never started treatment and included follow-up from treatment initiation through the earliest of the following: discontinuation of the assigned efavirenz-containing or efavirenz-free strategy plus 28 days for washout, discontinuation of all antiretroviral therapy plus 28 days, or last on-study contact (denoted "as-treated"). A sensitivity approach further censored as-treated follow-up at the time of DSMB recommendations (denoted "as-treated DSMB"). Antiretroviral modifications were allowed for reasons such as toxicity, virologic failure, or DSMB recommendations. Missing baseline data were rare (<1%); thus, covariate-adjusted analyses used a complete-case approach. Crude incidence rates were calculated as the number of cases per total person-years (PYs) at risk, presented as events per 1000 PYs. The incidence rate difference (IR) between treatment groups was quantified by a Mantel-Haenszel estimate, stratified by study, with a 95% CI computed using a rare-events variance estimator (26). The primary end point, time to suicidality, is presented with cumulative incidence curves and compared between groups with a Gray test (27), stratified by study, with nonsuicide death considered a competing risk.
Estimated efavirenz and baseline covariate associations were quantified by a hazard ratio (HR) from a Cox proportional hazards model, stratified by study. Modification of the efavirenz association by covariates was evaluated with interaction terms. The Cox model proportional hazards assumption was evaluated with a piecewise constant hazard with time (≤24 weeks vs. >24 weeks) and with a log-transformed time variable. An incidence rate ratio for the efavirenz association was estimated from an exact Poisson model, stratified by study, to evaluate sensitivity of the Cox model to low event frequencies. Analyses were conducted 2-sided with a significance level of 0.05, without adjustment for multiplicity, in SAS, versions 9.2 or 9.3 (phreg, genmod) (SAS Institute, Cary, North Carolina), and in R, version 2.15.1, competing risks package (cmprsk) (www.r-project.org). Role of the Funding Source The National
11,020
25,911,964
The most promising species with antibacterial potential against cariogenic bacteria are: Achillea ligustica, Baccharis dracunculifolia, Croton cajucara, Cryptomeria japonica, Coriandrum sativum, Eugenia caryophyllata, Lippia sidoides, Ocimum americanum, and Rosmarinus officinalis. In some cases, the major phytochemical compounds determine the biological properties of EOs. Menthol and eugenol were considered outstanding compounds demonstrating an antibacterial potential. Only L. sidoides mouthwash (1%) has shown clinical antimicrobial effects against oral pathogens thus far.
Dental caries remains the most prevalent and costly oral infectious disease worldwide . Several methods have been employed to prevent this biofilm-dependent disease , including the use of essential oils ( EOs ) . In this systematic review , we discuss the antibacterial activity of EOs and their isolated constituents in view of a potential applicability in novel dental formulations .
Essential oil compounds such as found in thyme extract are established for the therapy of chronic and acute bronchitis . Various pharmacodynamic activities for thyme extract and the essential thyme oil , respectively , have been demonstrated in vitro , but availability of these compounds in the respective target organs has not been proven . Thus , investigation of absorption , distribution , metabolism , and excretion is necessary to provide the link between in vitro effects and in vivo studies . To determine the systemic availability and the pharmacokinetics of thymol after oral application to humans , a clinical trial was carried out in 12 healthy volunteers . Each subject received a single dose of a Bronchipret TP tablet , which is equivalent to 1.08 mg thymol . No thymol could be detected in plasma or urine . However , the metabolites thymol sulfate and thymol glucuronide were found in urine and identified by LC-MS/MS . Plasma and urine samples were analyzed after enzymatic hydrolysis of the metabolites by headspace solid-phase microextraction prior to GC analysis and flame ionization detection . Thymol sulfate , but not thymol glucuronide , was detectable in plasma . Peak plasma concentrations were 93.1+/-24.5 ng ml(-1 ) and were reached after 2.0+/-0.8 hours . The mean terminal elimination half-life was 10.2 hours . Thymol sulfate was detectable up to 41 hours after administration . Urinary excretion could be followed over 24 hours . The amount of both thymol sulfate and glucuronide excreted in 24-hour urine was 16.2%+/-4.5 % of the dose Essential oils of many plants have been previously tested in the treatment of oral diseases and other infections . This study was a randomized , double-blind , parallel study with an active control , which aimed to evaluate the efficacy of three formulations of the Lippia sidoides Cham . essential oil ( LSO ) in the reduction of salivary Streptococcus mutans in children with caries . 
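The thymol sulfate kinetics reported in the bronchitis trial above (peak of about 93.1 ng/ml at about 2.0 h, terminal half-life 10.2 h) imply a simple half-life decay after the peak. The sketch below assumes mono-exponential (first-order) elimination purely for illustration; the study reports only the fitted parameters, not this model.

```python
# Illustrative first-order elimination using the parameters reported above.
# The mono-exponential model is an assumption, not fitted in the study.

T_HALF_H = 10.2        # terminal elimination half-life (hours)
C_PEAK_NG_ML = 93.1    # observed peak plasma concentration (ng/ml)
T_PEAK_H = 2.0         # time of peak (hours)

def conc_at(t_hours):
    """Plasma concentration at time t, assuming mono-exponential decay
    after the peak: C(t) = C_peak * (1/2)^((t - t_peak) / t_half)."""
    if t_hours <= T_PEAK_H:
        raise ValueError("model only describes the post-peak phase")
    return C_PEAK_NG_ML * 0.5 ** ((t_hours - T_PEAK_H) / T_HALF_H)

# At 41 h (about 3.8 half-lives past the peak) roughly 7 % of the peak
# remains -- consistent with thymol sulfate still being detectable at 41 h.
print(round(conc_at(41.0), 1))  # prints 6.6 (ng/ml)
```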
81 volunteers , aged 6 - 12 years , both genders , with caries , were recruited to participate in this study , and randomly assigned to one of five different groups . Each group received topical treatment with either 1.4 % LSO toothpaste , 1.4 % LSO gel , 0.8 % LSO mouthwash , 1 % chlorhexidine gel , or 0.12 % chlorhexidine mouthwash . A 5-ml volume of each gel was placed inside disposable trays , and applied for 1 min , every 24 h , for 5 consecutive days . The mouthwash groups used a 5-ml volume of mouthwash inside disposable syringes . In the toothpaste group , children brushed their teeth for 1 min , once a day for 5 days . Saliva was collected before and after treatment . MS colonies were counted , isolated and confirmed through biochemical tests . Differences in MS levels measured on different days within the same treatment group were verified only with LSO toothpaste , chlorhexidine gel and chlorhexidine mouthwash . Comparison between the LSO mouthwash , toothpaste and gel groups showed that the toothpaste group expressed significantly lower MS levels than the mouthwash and gel groups at day 30 . Chlorhexidine significantly reduced MS levels after 5 days of treatment , but these levels returned to baseline in other periods of the study . LSO toothpaste reduced MS levels after 5 days of treatment , and MS levels remained low and did not return to baseline during subsequent analyses . Hence , LSO toothpaste demonstrated the most long-lasting MS reduction in saliva , whereas the other LSO formulations did not effectively reduce MS levels in children with dental caries Objectives : The antiplaque and antigingivitis effect of Lippia sidoides ( LS ) was evaluated in this in vivo investigation . Material and Methods : Twenty-three subjects participated in a cross-over , double-blind clinical study , using a 21-day partial-mouth experimental model of gingivitis . 
A toothshield was constructed for each volunteer , avoiding the brushing of the 4 experimental posterior teeth in the lower left quadrant . The subjects were randomly assigned initially to use either the placebo gel ( control group ) or the test gel , containing 10 % LS ( test group ) . Results : The clinical results showed statistically significant differences for plaque index ( PLI ) ( p<0.01 ) between days 0 and 21 in both groups ; however , only the control group showed a statistically significant difference ( p<0.01 ) for the bleeding ( IB ) and gingival ( GI ) indices within the experimental period of 21 days . On day 21 , the test group presented significantly better results than the control group with regard to the GI ( p<0.05 ) . Conclusions : The test gel containing 10 % LS was effective in the control of gingivitis Several different plant extracts have been evaluated with respect to their antimicrobial effects against oral pathogens and for reduction of gingivitis . Given that a large number of these substances have been associated with significant side effects that contraindicate their long-term use , new compounds need to be tested . The aim of this study was to assess the short-term safety and efficacy of a Lippia sidoides ( "alecrim pimenta" )-based essential oil mouthrinse on gingival inflammation and bacterial plaque . Fifty-five patients were enrolled into a pilot , double-blinded , randomized , parallel-armed study . Patients were randomly assigned to undergo a 7-day treatment regimen with either the L. sidoides-based mouthrinse or 0.12 % chlorhexidine mouthrinse . The results demonstrated decreased plaque index , gingival index and gingival bleeding index scores at 7 days , as compared to baseline . There was no statistically significant difference ( p>0.05 ) between test and control groups for any of the clinical parameters assessed throughout the study . Adverse events were mild and transient . 
The findings of this study demonstrated that the L. sidoides-based mouthrinse was safe and efficacious in reducing bacterial plaque and gingival inflammation BACKGROUND To investigate the methodological quality of randomized controlled trials in three areas of complementary medicine . METHODS The methodological quality of 207 randomized trials collected for five previously published systematic reviews on homeopathy , herbal medicine ( Hypericum for depression , Echinacea for common cold ) , and acupuncture ( for asthma and chronic headache ) was assessed using a validated scale ( the Jadad scale ) and single quality items . RESULTS While the methodological quality of the trials was highly variable , the majority had important shortcomings in reporting and/or methodology . Major problems in most trials were the description of allocation concealment and the reporting of drop-outs and withdrawals . There were relevant differences in single quality components between the different complementary therapies : for example , acupuncture trials reported adequate allocation concealment less often ( 6 % versus 32 % of homeopathy and 26 % of herb trials ) , and trials on herbal extracts had better summary scores ( mean score 3.12 versus 2.33 for homeopathy and 2.19 for acupuncture trials ) . Larger trials published more recently in journals listed in Medline and in English language scored significantly higher than trials not meeting these criteria . CONCLUSION Trials of complementary therapies often have relevant methodological weaknesses . The type of weaknesses varies considerably across interventions A novel mouthrinse ( IND 61,164 ) containing essential oils and extracts from four plant species ( Melaleuca alternifolia , Leptospermum scoparium , Calendula officinalis and Camellia sinensis ) was tested . This study aimed to evaluate the safety , palatability and preliminary efficacy of the rinse . Fifteen subjects completed the Phase I safety study . 
Seventeen subjects completed the Phase II randomized placebo-controlled study . Plaque was collected , and gingival and plaque indices were recorded ( baseline , 6 weeks , and 12 weeks ) . The relative abundance of two periodontal pathogens ( Actinobacillus actinomycetemcomitans , Tannerella forsythensis ) was determined utilizing digoxigenin-labeled DNA probes . ANCOVA was used at the p = 0.05 level of significance . Two subjects reported a minor adverse event . One subject withdrew from the study . Several subjects objected to the taste of the test rinse but continued treatment . Differences between gingival index , plaque index or relative abundance of either bacterial species did not reach statistical significance when comparing nine placebo subjects with eight test rinse subjects . Subjects exposed to the test rinse experienced no abnormal oral lesions , altered vital signs , or changes in liver , kidney , or bone marrow function . Larger scale studies would be necessary to determine the efficacy and oral health benefits of the test rinse BACKGROUND Gingivitis is a chronic inflammatory condition , resulting from gingival bacteria and bacterial byproducts . Antiplaque oral rinses reduce inflammation by removing or inhibiting plaque formation . The purpose of this pilot study was to examine the anti-inflammatory effects of HM-302 , a mouth rinse based on natural products , on gingival inflammation . METHODS A prospective , double-blinded , randomized parallel-group controlled trial involving 62 patients was conducted to assess efficacy and safety . During a 2-week period with no dental hygiene , subjects were randomized to receive either the study rinse ( HM-302 ) ; a cetylpyridinium chloride ( CPC ) rinse ; an essential oils ( EO ) rinse ; or a water-only preparation . The gingival index ( GI ) , plaque index ( PI ) , and number of bleeding sites were measured at baseline and at the end of the study period . 
RESULTS Progression of gingival inflammation resulting from lack of dental hygiene was lowest in patients treated with the HM-302 rinse , and was significantly less marked than in patients treated with the water-only preparation . When compared to the CPC and EO treatments , HM-302 was the only mouth rinse that was significantly better than the control , with respect to both the change in absolute GI scores ( p = .006 ) and to the percent increase in GI scores ( p = .012 ) . No serious adverse effects were noted in any of the study groups . CONCLUSION HM-302 is a safe and effective treatment for preventing the development of gingival inflammation in an experimental gingivitis model . Further research is needed to evaluate its long-term effects Randomized , controlled trials ( RCTs ) of herbal interventions often inadequately describe important aspects of their methods ( 1 - 4 ) . Although the quality of reporting of these trials may be improving with time , many still lack important information , particularly about the composition of the herbal intervention ( 4 , 5 ) . Crude herbal drugs are natural products and their chemical composition varies depending on several factors , such as geographic source of the plant material , climate in which it was grown , and time of harvest . Commercially available herbal medicinal products also vary in their content and concentration of chemical constituents from batch to batch and when products containing the same herbal ingredient are compared among manufacturers ( 6 - 14 ) . Even when herbal products are standardized for content of known active or marker compounds to achieve more consistent pharmaceutical quality , there is variation in the concentrations of other constituents . These variations can result in differences in pharmacologic activity in vitro ( 15 ) and in bioavailability in humans ( 16 ) . 
Mindful of these issues , we elaborated on the 22-item checklist of the Consolidated Standards of Reporting Trials ( CONSORT ) statement ( 17 ) to help authors and editors improve reporting of RCTs of herbal interventions . Methods We developed these reporting recommendations in 3 phases that included premeeting item generation , a consensus meeting , and postmeeting feedback . The individuals who participated are listed in the Appendix . To generate items , 1 investigator conducted telephone interviews of 16 participants with expertise in the methods and reporting of RCTs ( 5 participants ) , pharmacognosy ( 4 participants ) , herbal medicinal products ( 5 participants ) , medical statistics ( 1 participant ) , and herbal product manufacturing ( 1 participant ) . The investigator asked participants to suggest revisions to existing CONSORT checklist items and also additional items required for reporting trials of herbal interventions . He asked participants to nominate revisions or new items on the basis of empirical evidence that not reporting the item would bias estimates of treatment effect . When no empirical evidence was available , commonsense reasoning was acceptable . After completing all telephone calls , the investigator thematically grouped items and circulated them by e-mail to each participant for review . Fourteen participants attended the consensus meeting . The meeting began with a review of the premeeting checklist item suggestions . We emphasized minimizing item elaborations and additions and basing elaborations on evidence whenever possible . Each item suggestion was presented and followed by debate for its inclusion , deletion , or modification . This process was repeated until all items were reviewed and a consensus emerged . After the consensus meeting , we circulated a draft summary report to all participants to ensure that it accurately represented decisions made during the consensus meeting . 
We then circulated the report to the wider CONSORT Group for input and revised it on the basis of their suggestions . Ethical approval was obtained from The University of Toronto Health Sciences Ethics Review Committee on 23 January 2004 . Financial support for the consensus meeting was provided by the Canadian Institutes of Health Research . The funding body had no role in the design , conduct , or analysis of this study and did not influence the decision to submit the manuscript for publication . All researchers are independent of the funders . Results The group did not recommend any new CONSORT checklist items or modifications in the CONSORT flow diagram . We did , however , elaborate on 9 of the 22 CONSORT checklist items to enhance their relevance to trials of herbal interventions ( Table , Figure ; Appendix Table ) , including minor recommendations for 8 items ( item 1 [ title and abstract ] , item 2 [ background ] , item 3 [ participants ] , item 6 [ outcomes ] , item 15 [ baseline data ] , item 20 [ interpretation ] , item 21 [ generalizability ] , and item 22 [ overall evidence ] ) and detailed recommendations for 1 item ( item 4 [ interventions ] ) . Table . Proposed Elaboration of CONSORT Checklist Item 4 for Reporting Randomized , Controlled Trials of Herbal Medicine Interventions Figure . The high-pressure liquid chromatography chemical fingerprint for the extract of Ginkgo biloba L Appendix Table . Proposed Elaborations of CONSORT Items for Randomized , Controlled Trials of Herbal Medicine Interventions The Table shows the detailed recommendations for item 4 and an example of good reporting related to each recommendation . These recommendations begin with the words " where applicable " to indicate that all information suggested may not be applicable to every type of herbal medicine intervention . 
For example , an herbal medicinal product comprising crude herbal material ( for example , leaves and stems ) simply prepared as a tea or decoction does not require description of the type and concentration of solvent used and the ratio of herbal drug to extract ( item 4B.3 ) . Also , not every herbal medicine intervention will have a finished product or extract name or manufacturer ( item 4A.2 ) , but instead may be made by the investigators specifically for the study . In such circumstances , all methods used in preparing and formulating the product must be reported . Similarly , item 4F is not required for herbal interventions when the practitioner is not a part of the intervention . With these exceptions , we recommend that all information shown in the Table be reported for all herbal interventions . Discussion We developed recommendations to be used in conjunction with the existing CONSORT checklist when reporting RCTs of herbal interventions . In particular , we thought it imperative that reports of RCTs provide clear and complete descriptions of the herbal intervention . We think that our recommendations might also be relevant for reporting herbal interventions in other research designs , whether preclinical ( for example , in vivo or in vitro ) or clinical ( for example , N-of-1 trials ) , and refer interested readers to a detailed explanatory document that further describes each of our recommendations and provides additional examples of good reporting ( 22 ) . We hope that authors find our recommendations instructive and that journals will endorse their use and modify their instructions to authors accordingly BACKGROUND The authors conducted a two-week clinical study to determine the remineralizing effect of an experimental mouthrinse containing both fluoride and essential oils in an intraoral caries test model . METHODS The study used an observer-blinded , randomized , controlled , 3 x 3 crossover design . 
The authors enrolled in the study 153 subjects , each of whom had a mandibular removable partial denture . Two partially demineralized human enamel specimens were mounted on each subject 's removable partial denture . Subjects used either a fluoride mouthrinse with essential oils ( the test mouthrinse ) , a fluoride non-essential-oils mouthrinse ( the positive control ) or an essential oil non-fluoride mouthrinse ( the negative control ) twice daily for 14 days . The researchers assessed specimens for mineral content change and fluoride uptake using surface microhardness , or SMH , testing and enamel fluoride analysis , respectively . RESULTS Of the 153 subjects enrolled in the study , 125 subjects were evaluable at the study endpoint . The results after two weeks showed that the percentage of SMH recovery was 42 percent in the test group , 36 percent in the positive control group and 16 percent in the negative control group . The fluoride uptake was 19 micrograms per square centimeter , 16 microg/cm2 and 3 microg/cm2 for the test mouthrinse , positive control and negative control groups , respectively . In terms of both percentage of SMH recovery and fluoride uptake , the test mouthrinse and positive control mouthrinse were statistically higher than the negative control mouthrinse , and the test mouthrinse was " at least as good as " the positive control mouthrinse . CONCLUSIONS This study provides evidence that an essential oil mouthrinse with 100 parts per million fluoride is effective in promoting enamel remineralization and fluoride uptake . CLINICAL IMPLICATIONS The combination of fluoride and essential oils in a mouthrinse may provide anticaries efficacy , in addition to essential oils ' previously established antigingivitis efficacy
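The " percentage of SMH recovery " figures in the remineralization study above ( 42 % , 36 % , 16 % ) are conventionally computed from three hardness measurements per specimen: sound baseline, after demineralization, and after treatment. Both the formula and the hardness values below are assumptions for illustration; the abstract reports only the final percentages.

```python
# Illustrative computation of percent surface-microhardness (SMH) recovery.
# The formula and the hardness numbers are assumptions, not study data.

def pct_smh_recovery(baseline, demineralized, post_treatment):
    """Percent of the hardness lost to demineralization that was regained:
    100 * (post - demin) / (baseline - demin)."""
    return 100.0 * (post_treatment - demineralized) / (baseline - demineralized)

# Hypothetical hardness values chosen to reproduce the 42 % test-group figure:
print(round(pct_smh_recovery(350.0, 150.0, 234.0)))  # prints 42
```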
11,021
25,756,045
IPC decreases hepatocellular damage in liver surgery as determined by transaminases but does not translate to any significant clinical benefit in orthotopic liver transplant or liver resection . Available clinical evidence does not support routine use of IPC in liver surgery as it does not offer any apparent benefit in perioperative outcome .
BACKGROUND Ischemia-reperfusion injury is a major cause of post-liver-surgery complications . Ischemic preconditioning ( IPC ) has been demonstrated to protect against ischemia-reperfusion injury . Clinical studies have examined IPC in liver surgery but with conflicting results . This systematic review aimed to evaluate the effects of IPC on outcome in clinical liver surgery .
Objective : To evaluate the protective effects of ischemic preconditioning in a prospective randomized study involving a large population of unselected patients and to identify factors affecting the protective effects . Summary Background Data : Ischemic preconditioning is an effective protective strategy in several animal models . Protection has also been suggested in a small series of patients undergoing a hemihepatectomy with 30 minutes of inflow occlusion . Whether preconditioning confers protection in other types of liver resection and longer periods of ischemia is unknown . Therefore , we conducted a prospective randomized study to evaluate the impact of ischemic preconditioning in liver surgery . Methods : A total of 100 unselected patients undergoing major liver resection ( > bisegmentectomy ) under inflow occlusion for at least 30 minutes were randomized during surgery to either receive or not receive an ischemic preconditioning protocol ( 10 minutes of ischemia followed by 10 minutes of reperfusion ) . Univariate and multivariate analyses were performed to identify independent factors affecting the protective effects of ischemic preconditioning . ATP contents in liver were measured as a possible mechanism of protection . Results : Both groups ( n = 50 in each ) were comparable regarding age , gender , duration of inflow occlusion , and resected liver volumes . Postoperative serum transaminase levels were significantly lower in preconditioned than in control patients ( median peak AST 364 U/L vs. 520 U/L , P = 0.028 ; ALT 406 vs. 519 U/L , P = 0.049 ) . Multivariate regression analysis revealed an increased benefit of ischemic preconditioning in younger patients , in patients with longer duration of inflow occlusion ( up to 60 minutes ) , and in cases of lower resected liver volume ( < 50 % ) . Patients with steatosis were also particularly protected by ischemic preconditioning . 
ATP content in liver tissue was preserved by ischemic preconditioning in young but not older patients . Conclusions : This study establishes ischemic preconditioning as a protective strategy against hepatic ischemia in humans . The strategy is particularly effective in young patients requiring a prolonged period of inflow occlusion , and in the presence of steatosis , and is possibly related to preservation of ATP content in liver tissue . Other strategies are needed in older patients BACKGROUND Clamping of the portal triad ( Pringle maneuver ) prevents blood loss during liver resection , but leads to liver injury upon reperfusion . Ischemic preconditioning ( IP ) has been shown to protect the liver against prolonged ischemic injury in animal models . However , the clinical value of this procedure has not yet been established . METHODS 61 patients undergoing hepatic resection under inflow occlusion were randomized either to receive ( Group-A , n = 30 ) or not to receive ( Group-B , n = 31 ) IP ( 10 minutes of ischemia followed by 10 minutes of reperfusion ) . RESULTS Mean ( +/- SD ) , Group-A vs. Group-B : Pringle time of 34 +/- 14 and 33 +/- 12 minutes and the extent of resected liver tissue ( 2.7 +/- 1.3 vs. 2.7 +/- 1.1 segments ) were comparable in both groups . Complications , including death , severe liver dysfunction and biliary leakage , occurred in 6 patients of Group-A vs. 14 patients of Group-B ( p<0.05 ) . Intraoperative blood loss was significantly lower in Group-A ( 1.28 +/- 0.91 l vs. 1.94 +/- 0.76 l ; p<0.001 ) with 5 vs. 15 patients requiring transfusions ( p<0.01 ) . In a multivariate analysis the duration of the Pringle maneuver ( p<0.05 ) and the absence of preconditioning ( p<0.05 ) were independent predictors of the occurrence of postoperative complications . 
CONCLUSIONS IP protects against reperfusion injury , reduces the incidence of complications after hepatic resection under inflow occlusion and is simple to use in clinical practice Background / Purpose Continuous inflow vascular occlusion during liver resections causes less severe ischemia and reperfusion injury ( IRI ) if it is preceded by ischemic preconditioning ( IP ) or if intermittent inflow occlusion is used during the resection . No previous clinical trial has studied the effects of adding IP to intermittent inflow occlusion . Methods Consecutive patients ( n = 32 ) with suspicion of malignant liver disease had liver resections ( minimum 2 segments ) performed with inflow occlusion ( intermittent clamping in a manner of 15 min of ischemia and 5 min of reperfusion repetitively ; 15/5 ) . Half of the patients were randomized to receive IP ( 10 min of ischemia and 10 min of reperfusion before parenchymal transection ; 10/10 ) . The patients were stratified according to volume of resection and none had chronic liver disease . The patients were followed for 5 days with microdialysis ( μD ) . Results All patients completed the study and there were no deaths . No differences were seen between the groups regarding demographics or perioperative parameters ( bleeding , duration of ischemia , resection volume , complications , and serum laboratory tests ) . There were no differences in alanine aminotransferase ( ALT ) , aspartate aminotransferase ( AST ) , bilirubin , or prothrombin time (PT)-INR levels , but μD revealed lower levels of lactate , pyruvate , and glucose in the IP group having major liver resections ( analysis of variance ; ANOVA ) . Nitrite and nitrate levels in μD decreased postoperatively , but no differences were seen between the groups . In one patient an elevated μD-glycerol curve was seen before the diagnosis of a stroke was made . Conclusions IP before intermittent vascular occlusion does not reduce the serum parameters used to assess IRI . 
IP seems to improve aerobic glucose metabolism , as the levels of glucose , pyruvate , and lactate locally in the liver were reduced , compared to controls , in patients having > 3 segments resected . μD may be used to monitor metabolism locally BACKGROUND / AIMS Liver pathology induced by chemotherapy ( steatosis or vascular injury ) is known to increase the liver 's sensitivity to ischemia/ reperfusion ( I/R ) injury , thereby increasing morbidity and mortality after liver resection . Our aim was to assess whether ischemic preconditioning ( IP ) reduces I/R injury to livers with chemotherapy-induced pathology . METHODS We analyzed a series of livers from patients treated with chemotherapy for colorectal cancer who underwent IP ( n=30 ) or not ( n=31 ) before hepatectomy . All but one of the livers exhibited chemotherapy-induced steatosis and / or peliosis before the I/R insult . RESULTS Necrosis was less frequent ( p=0.038 ) in livers with IP than in the others . IP had no influence on apoptosis as assessed by terminal transferase uridyl nick-end labeling ( TUNEL ) assay or caspase-3 , -8 and -9 expression . IP induced a twofold increase in B-cell leukemia/ lymphoma 2 ( Bcl-2 ; p<0.05 ) , which was localized to hepatocytes of centrolobular and peliotic areas and colocalized with the autophagy protein beclin-1 in livers with IP , suggesting their coordinated role in autophagy . Increased expression of the phosphorylated Bcl-2 was observed in preconditioned livers and was associated with a decreased immunoprecipitation of beclin-1 and the increased expression of light chain 3 type II ( LC3-II ) . The increased number of autophagic vacuoles seen by electron microscopy confirmed an association of autophagy in chemotherapy-injured livers following IP . However , the differences in protein expression were not reflected in postresection liver-injury tests or measure of patient morbidity . 
CONCLUSIONS IP is associated with a reduction in necrosis of hepatocytes already damaged by chemotherapy and an activation of autophagy . Bcl-2 and beclin-1 could be major targets in the regulation of cell death during I/R injury BACKGROUND Two randomized prospective studies suggested that ischemic preconditioning ( IP ) protects the human liver against ischemia-reperfusion injury after hepatectomy performed under continuous clamping of the portal triad . The primary goal of this study was to determine whether IP protects the human liver against ischemia-reperfusion injury after hepatectomy under continuous vascular exclusion with preservation of the caval flow . STUDY DESIGN Sixty patients were randomly divided into two groups : with ( n=30 ; preconditioning group ) and without ( n=30 ; control group ) IP ( 10 minutes of portal triad clamping and 10 minutes of reperfusion ) before major hepatectomy under vascular exclusion of the liver preserving the caval flow . Serum concentrations of aspartate transferase , alanine transferase , glutathione-S-transferase , and bilirubin and prothrombin time were regularly determined until discharge and at 1 month . Morbidity and mortality were determined in both groups . RESULTS Peak postoperative concentrations of aspartate transferase were similar in the groups with and without IP ( 851 +/- 1,733 IU/L and 427 +/- 166 IU/L respectively , p=0.2 ) . A similar trend toward a higher peak concentration of alanine transferase and glutathione-S-transferase was indeed observed in the preconditioning group compared with the control group . Morbidity and mortality rates and lengths of ICU and hospitalization stays were similar in both groups . CONCLUSIONS IP does not improve liver tolerance to ischemia-reperfusion after hepatectomy under vascular exclusion of the liver with preservation of the caval flow . This maneuver does not improve postoperative liver function and does not affect morbidity or mortality rates . 
The clinical use of IP through 10 minutes of warm ischemia in this technique of hepatectomy is not currently recommended Hepatic pedicle clamping ( HPC ) is widely used to control intraoperative bleeding during hepatectomy ; intermittent HPC is better tolerated but is associated with blood loss during each period of reperfusion . Recently , it has been shown that ischemic preconditioning ( IP ) reduces the ischemia-reperfusion damage for up to 30 minutes of continuous clamping in healthy liver . We evaluated the safety of IP for more prolonged periods of continuous clamping in 42 consecutive patients with healthy liver submitted to hepatectomy . IP was used in 21 patients ( group A ) ; mean +/- SD of liver ischemia was 54 +/- 19 minutes ( range , 27 - 110 ; in 7 cases > 60 minutes ) . In the other 21 patients , continuous clamping alone was used ( Group B ) ; liver ischemia lasted 36 +/- 14 minutes ( range , 13 - 70 ; in 2 cases > 60 minutes ) . Two patients in Group A ( 9.5 % ) and 3 in Group B ( 14.2 % ) received blood transfusions . In spite of the longer duration of ischemia ( P=.001 ) , patients with IP had lower aspartate aminotransferase ( AST ; P=.03 ) and alanine aminotransferase ( ALT ; P = not significant ) at postoperative day 1 , with a similar trend at postoperative day 3 . This was reconfirmed by multiple regression analysis , which showed that although postoperative transaminases increased with increasing duration of ischemia and of the operation in both groups , the increases were significantly smaller ( P<.001 ) with the use of preconditioning . In conclusion , the present study confirms that IP is safe and effective for liver resection in healthy liver and is also better tolerated than continuous clamping alone for prolonged periods of ischemia . This technique should be preferred to continuous clamping alone in healthy liver . 
Additional studies are needed to assess the role of IP in cirrhotic liver and to compare IP with intermittent clamping BACKGROUND The Pringle manoeuvre and ischaemic preconditioning are applied to prevent blood loss and ischaemia-reperfusion injury , respectively , during liver surgery . In this prospective clinical trial we report on the intraoperative haemodynamic effects of the Pringle manoeuvre alone or in combination with ischaemic preconditioning . METHODS Patients ( n=68 ) were assigned randomly to three groups : ( i ) resection with the Pringle manoeuvre ; ( ii ) with ischaemic preconditioning before the Pringle manoeuvre for resection ; ( iii ) without pedicle clamping . RESULTS Following the Pringle manoeuvre the mean arterial pressure increased transiently , but significantly decreased after unclamping as a result of peripheral vasodilation . Ischaemic preconditioning improved cardiovascular stability by lowering the need for catecholamines after liver reperfusion without affecting the blood-sparing benefits of the Pringle manoeuvre . In addition , ischaemic preconditioning protected against reperfusion-induced tissue injury . CONCLUSIONS Ischaemic preconditioning provides both better intraoperative haemodynamic stability and anti-ischaemic effects , thereby allowing us to take full advantage of blood loss reduction by the Pringle manoeuvre Ischemic preconditioning ( IPC ) has the potential to decrease graft injury and morbidity after liver transplantation . We prospectively investigated the safety and efficacy of 5 minutes of IPC induced by hilar clamping in local deceased donor livers randomized 1:1 to standard ( STD ) recovery ( N = 28 ) or IPC ( N = 34 ) . 
Safety was assessed by measurement of heart rate, blood pressure, and visual inspection of abdominal organs during recovery, and efficacy by recipient aminotransferases (aspartate aminotransferase [AST] and alanine aminotransferase [ALT], both measured in U/L), total bilirubin, and international normalized ratio of prothrombin time (INR) after transplantation. IPC performed soon after laparotomy did not cause hemodynamic instability or visceral congestion. Recipient median AST, median ALT, and mean INR in STD vs. IPC were as follows: day 1 AST 696 vs. 841 U/L; day 3 AST 183 vs. 183 U/L; day 1 ALT 444 vs. 764 U/L; day 3 ALT 421 vs. 463 U/L; day 1 INR 1.7 +/- .4 vs. 2.0 +/- .8; and day 3 INR 1.3 +/- .2 vs. 1.4 +/- .3; all P > .05. No instances of nonfunction occurred. The 6-month graft and patient survival STD vs. IPC were 82% vs. 91% and median hospital stay was 10 vs. 8 days; both P > .05. In conclusion, deceased donor livers tolerated 5 minutes of hilar clamping well, but IPC did not decrease graft injury. Further trials with longer periods of preconditioning such as 10 minutes are needed Background Extensive experimental studies and a few clinical series have shown that ischemic preconditioning (IPC) attenuates oxidative ischemia/reperfusion (I/R) injuries in liver resections performed under inflow vascular control. Selective hepatic vascular exclusion (SHVE) employed during hepatectomies completely deprives the liver of blood flow, as it entails simultaneous clamping of the portal triad and the main hepatic veins. The aim of the present study was to identify whether IPC can also protect hepatocytes during liver resections performed under SHVE. Methods Patients undergoing major liver resection were randomly assigned to have either only SHVE (control group, n = 43) or SHVE combined with IPC (10 min of ischemia followed by 15 min of reperfusion before SHVE was applied; IPC group, n = 41).
Results The two groups were comparable with regard to age, liver resection volume, blood loss and transfusions, warm ischemic time, and total operative time. In liver remnant biopsies obtained 60 min post-reperfusion, IPC patients had significantly fewer cells stained positive by TUNEL compared to controls (19% ± 8% versus 45% ± 12%; p < 0.05). Also, IPC patients had attenuated hepatocyte necrosis, systemic inflammatory response, and oxidative stress, as manifested by lower postoperative peak values of aspartate transaminase, interleukin-6, interleukin-8, and malondialdehyde compared to controls. Morbidity was similar for the two groups, as were duration of intensive care unit stay and extent of total hospital stay. Conclusions In major hepatectomies performed under SHVE, ischemic preconditioning appears to attenuate the apoptotic response of the liver remnant, possibly through alteration of inflammatory and oxidative pathways The effect of ischemic preconditioning (IPC) in orthotopic liver transplantation (OLT) has not yet been clarified. We performed a pilot study to evaluate the effects of IPC in OLT by comparing the outcomes of recipients of grafts from deceased donors randomly assigned to receive (IPC+ group, n = 23) or not (IPC- group, n = 24) IPC (10-min ischemia + 15-min reperfusion). In 10 cases in the IPC+ group and in 12 in the IPC- group, the expression of inducible nitric oxide synthase (iNOS), neutrophil infiltration, and hepatocellular apoptosis were tested by immunohistochemistry in prereperfusion and postreperfusion biopsies. Median aspartate aminotransferase (AST) levels were lower in the IPC+ group vs. the IPC- group on postoperative days 1 and 2 (398 vs. 1,234 U/L, P = 0.002; and 283 vs. 685 U/L, P = 0.009). Alanine aminotransferases were lower in the IPC+ vs. the IPC- group on postoperative days 1, 2, and 3 (333 vs. 934 U/L, P = 0.016; 492 vs. 1,040 U/L, P = 0.008; and 386 vs.
735 U/L, P = 0.022). Bilirubin levels and prothrombin activity throughout the first 3 postoperative weeks, incidence of graft nonfunction, and graft and patient survival rates were similar between groups. Prereperfusion and postreperfusion immunohistochemical parameters did not differ between groups. iNOS was higher postreperfusion vs. prereperfusion in the IPC- group (P = 0.008). Neutrophil infiltration was higher postreperfusion vs. prereperfusion in both groups (IPC+, P = 0.007; IPC-, P = 0.003). Prereperfusion and postreperfusion apoptosis was minimal in both groups. In conclusion, IPC reduced ischemia/reperfusion injury through a decrease of hepatocellular necrosis, but it showed no clinical benefits While animal studies show that ischemic preconditioning (IPC) is beneficial in liver transplantation (LT), evidence from a few smaller clinical trials is conflicting. From October 2003 to July 2006, 101 deceased donors (DD) were randomized to 10 min IPC (n = 50) or No IPC (n = 51). The primary objective was the efficacy of IPC to decrease reperfusion (RP) injury. Both groups had a similar donor risk index (DRI) (1.54 vs. 1.57). Aminotransferases on days 1 and 2 were significantly greater (p < 0.05) in IPC recipients. In multivariate analyses, IPC had an independent effect only on day 2 aspartate transferase. Prothrombin time, bilirubin, and histological injury were similar in both groups. IPC had no significant effect on plasma TNF-α, IL-6 and IL-10 in the donor and TNF-α and IL-6 in the recipient. In contrast, IPC recipients had a significant rise in systemic IL-10 levels after RP (p < 0.05) and had fewer moderate/severe rejections within 30 days (p = 0.09). Hospital stay was similar in both groups. One-year patient and graft survival in IPC versus No IPC were 88% versus 78% (p = 0.1) and 86% versus 76% (p = 0.25), respectively. IPC increases RP injury after DDLT, an ‘IPC paradox’.
Other potential benefits of IPC are limited. IPC may be more effective in combination with other preconditioning regimens The aim of the study was to evaluate the safety and efficacy of IP in LT, particularly in marginal grafts. From 2007 to 2008, 75 LT donors were randomized to receive IP (IP+) or not (IP-). Considering the graft quality, we divided the main groups into two subgroups (marg+/marg-). IP was performed by 10-min inflow occlusion (Pringle maneuver utilizing a tourniquet). Donor variables considered were gender, age, AST/ALT, ischemia time, and steatosis. Recipient variables were gender, age, indication for LT, and MELD/CHILD/UNOS score. AST/ALT levels, INR, bilirubin, lactic acid, and bile output on postoperative days 1, 3 and 7 were evaluated. Histological analysis was performed evaluating necrosis/steatosis, hepatocyte swelling, PMN infiltration, and councilman bodies. Thirty patients received an IP+ liver. No differences were seen between groups considering recipient and donor variables. Liver function and AST/ALT levels showed no significant differences between the two main groups. Marginal IP+ livers showed lower AST levels on day 1 compared with untreated marginal livers (936.35 vs. 1268.23; p = 0.026). IP+ livers showed a significant reduction of moderate-severe hepatocyte swelling (33.3% vs. 65.9%; p = 0.043). IP+ patients had a significant reduction of positive early microbiological investigations (36.7% vs. 57.1%; p = 0.042). In our experience IP was safe also in marginal donors, showing a protective role against IRI AIM To investigate the protective effect of ischemic preconditioning (IPC) on hepatocellular carcinoma (HCC) patients with cirrhosis undergoing hepatic resection under hepatic inflow occlusion (HIO) and its possible mechanism.
METHODS Twenty-nine consecutive patients with resectable HCC were randomized into two groups: IPC group: before HIO, IPC with 5 min of ischemia and 5 min of reperfusion was given; control group: no IPC was given. Liver functions, hepatic Caspase-3 activity, and apoptotic cells were compared between these two groups. RESULTS On postoperative days (POD) 1, 3 and 7, the aspartate transaminase (AST) and alanine transaminase (ALT) levels in the IPC group were significantly lower than those in the control group (P < 0.05). On POD 3 and 7, the total bilirubin level in the IPC group was significantly lower than that in the control group (P < 0.05). On POD 1, the albumin level in the IPC group was higher than that in the control group (P = 0.053). After 1 h of reperfusion, both hepatic Caspase-3 activity and apoptotic sinusoidal endothelial cells in the IPC group were significantly lower than those in the control group (P < 0.05). CONCLUSION IPC has a potential protective effect on HCC patients with cirrhosis. Its protective mechanism underlying the suppression of sinusoidal endothelial cell apoptosis is achieved by inhibiting Caspase-3 activity To assess the immediate and long-term effects of ischemic preconditioning (IPC) in deceased donor liver transplantation (LT), we designed a prospective, randomized controlled trial involving 60 donors: control group (CTL, n = 30) or study group (IPC, n = 30). IPC was induced by 10-min hilar clamping immediately before recovery of organs. Clinical data and blood and liver samples were obtained in the donor and in the recipient for measurements. IPC significantly improved biochemical markers of liver cell function such as uric acid, hyaluronic acid and Hypoxia-Induced Factor-1alpha (HIF-1α) levels. Moreover, the degree of apoptosis was significantly lower in the IPC group.
On a clinical basis, IPC significantly improved the serum aspartate aminotransferase (AST) levels and reduced the need for reoperation in the postoperative period. Moreover, the incidence of primary nonfunction (PNF) was lower in the IPC group, but did not achieve statistical significance. We conclude that 10-min IPC protects against I/R injury in deceased donor LT HYPOTHESES Temporary vascular clampage (Pringle maneuver) during liver surgery can cause ischemia-reperfusion injury. In this process, activation of polymorphonuclear leukocytes (PMNLs) might play a major role. Thus, we investigated the effects of hepatic ischemic preconditioning on PMNL functions. DESIGN Prospective randomized study. Patients who underwent partial liver resection were randomly assigned to 3 groups: group 1 without Pringle maneuver; group 2 with Pringle maneuver; and group 3 with ischemic preconditioning using 10 minutes of ischemia and 10 minutes of reperfusion prior to Pringle maneuver for resection. SETTING University hospital, Munich, Germany. PATIENTS Seventy-five patients underwent hepatic surgery, mostly owing to metastasis. MAIN OUTCOME MEASURES Perioperative factors for PMNL activation, inflammation, and postoperative hepatocellular integrity. RESULTS Ischemia-reperfusion of the human liver (mean +/- SD time to perform the Pringle maneuver, 35.5 +/- 2.6 minutes) caused (1) a decrease in the number of circulating PMNLs, (2) their intrahepatic sequestration, (3) their systemic activation, and (4) a significant correlation between the degree of their postischemic activation and the postoperative rise in liver enzyme serum levels. In parallel, cytokines with proinflammatory and chemotactic properties were released, reaching the highest values when stimulation of PMNLs was most pronounced.
When ischemic preconditioning preceded the Pringle maneuver, activation of PMNLs and cytokine plasma levels were reduced, as evidenced by the attenuation of superoxide anion production, beta(2)-integrin up-regulation, and interleukin 8 serum concentrations, followed by a significant reduction in serum alanine aminotransferase levels on the first and second postoperative days. CONCLUSIONS These results demonstrate in humans that ischemic preconditioning reduces activation of PMNLs elicited by the Pringle maneuver. The down-regulation of potentially cytotoxic functions of PMNLs might be one of the yet unknown important pathways that altogether mediate protection by ischemic preconditioning BACKGROUND/AIMS The efficacy of ischemic preconditioning (IPC) in preventing reperfusion injury in human liver transplants is still questioned. Phosphoinositide-3-kinase (PI3K) is essential for IPC development in rodent livers. This work investigates whether PI3K-dependent signals might account for the inconsistent responses to IPC of transplanted human livers. METHODS Forty livers from deceased donors were randomized to receive or not IPC before recovery. PI3K activation was evaluated in biopsies obtained immediately before IPC and 2 h after reperfusion by measuring the phosphorylation of the PI3K downstream kinase PKB/Akt and the levels of the PI3K antagonist phosphatase tensin-homologue deleted from chromosome 10 (PTEN). RESULTS IPC increased PKB/Akt phosphorylation (p = 0.01) and decreased PTEN levels (p = 0.03) in grafts, but did not significantly ameliorate post-transplant reperfusion injury. By calculating T(2h)/T(0) PKB/Akt phosphorylation ratios, 10/19 (53%) of the preconditioned grafts had ratios above the control threshold (IPC-responsive), while the remaining nine grafts showed ratios comparable to controls (IPC-non-responsive). T(2h)/T(0) PTEN ratios were also decreased (p ≤ 0.03) only in IPC-responsive grafts.
The patients receiving IPC-responsive organs had ameliorated (p ≤ 0.05) post-transplant aminotransferase and bilirubin levels, while prothrombin activity was unchanged. CONCLUSIONS Impaired PI3K signaling might account for the variability in the responses to IPC of human grafts from deceased donors
11,022
24,302,533
Bleeding associated with surgery of the cervix appears to be reduced by vasopressin, used in combination with local anaesthetic. Tranexamic acid appears to be beneficial after knife and laser cone biopsy. There is some evidence that haemostatic suturing has an adverse effect on blood loss, cervical stenosis and satisfactory colposcopy
BACKGROUND Cervical intraepithelial neoplasia (CIN) is the most common pre-malignant lesion. Surgical treatments for CIN are commonly associated with blood loss. OBJECTIVES To assess the effectiveness and safety of interventions for preventing blood loss during the treatment of CIN.
OBJECTIVE To evaluate the ability of Amino-Cerv to promote healing of the cervix following loop electrical excision procedure (LEEP) of the cervix for cervical intraepithelial neoplasia. STUDY DESIGN A randomized study was conducted on 48 women in a private office setting. Patients were divided into two groups, to use or not use Amino-Cerv intravaginally for two weeks after the procedure. Statistical analysis of the findings was performed using the chi-square test. RESULTS Twenty of 24 (83%) of women using Amino-Cerv had completely healed tissue at four weeks, whereas 12 of 24 (50%) in the untreated group had healed tissue at four weeks. Findings were not statistically significant at two weeks but were statistically significant (P < .005, power = .55) at four weeks after the procedure. CONCLUSION Amino-Cerv after LEEP of the cervix promotes healing Summary. Fifty women undergoing laser vaporization of the cervix for cervical intraepithelial neoplasia were randomly allocated to one of two groups. In one group the patients received no anaesthesia; in the other, the ectocervix was infiltrated with 2 ml of Citanest with Octapressin (prilocaine 3% with 0.03 i.u./ml of felypressin) immediately before the procedure. The pain experienced by each group was assessed immediately after treatment by visual analogue and verbal rating scales. The pain experienced by those women receiving local anaesthesia was significantly reduced as assessed by the visual analogue scale (P = 0.011), and this reduction was not quite significant by the verbal rating scale (P = 0.06). The Citanest group had less troublesome bleeding, but the difference in bleeding between the two groups was not significant Objective To test the hypothesis that prilocaine with felypressin causes fewer side effects than lignocaine with adrenaline when performing large loop excision of the transformation zone of the cervix Abstract.
Laser conization of the cervix was performed in both inpatient and outpatient settings with either local or general anesthesia. All of the patients included had abnormal cervical smears and abnormal colposcopic findings and were allocated to one of two groups, A and B. Patients in group A had general anesthesia while patients in group B had only local anesthesia. A standard operative technique was used, and all patients had estimation of blood loss, recording of operative time, surgical suite time, anesthesia induction time, and assessment of postoperative pain and morbidity. Statistical analysis was performed using the Student t-test. We concluded that laser conization of the cervix can be performed more cheaply with local anesthesia than with general anesthesia and with little discomfort, less nausea, and vomiting OBJECTIVE The purpose of this study was to compare Monsel's paste with fulguration with ball electrode for hemostasis after loop electrosurgical excision procedure. STUDY DESIGN One hundred healthy women were assigned randomly by computer-generated random numbers to ball electrode or thickened Monsel's paste for hemostasis after loop electrosurgical excision procedure. Patients rated pain during hemostasis using a visual analog scale. At 2 weeks, postprocedural vaginal discharge was rated on a Likert scale. Pathology was reviewed for dysplasia grade and margin status. Recurrent dysplasia on repeat Papanicolaou tests was noted. RESULTS Six patients (2 Monsel's and 4 fulguration) required an alternate method of hemostasis. Patient demographics, postprocedural discharge, and recurrent dysplasia were comparable between the 2 groups. Visual analog scale scores and hemostasis time were significantly higher in the fulguration group. Estimated blood loss, although higher in the fulguration group, was not significant between groups.
CONCLUSION Monsel's paste and fulguration with ball electrode appear to be equally effective as hemostatic agents after loop electrosurgical excision procedure Objective To investigate whether central diathermy ball cauterization after loop excision affects satisfactory colposcopy at follow-up. Methods One hundred one consecutive women with the squamocolumnar junction visible at the ectocervix scheduled for loop excision were assigned alternately into two groups. In group A, diathermy ball cauterization was applied to the entire crater following excision. In group B, cauterization was avoided in a 2-3-mm zone around the new os. The women were re-examined 4 months postoperatively by colposcopy and microcolpohysteroscopy with the specific intention to identify the location of the squamocolumnar junction. The examiners performing colposcopy and microcolpohysteroscopy were not aware of each other's interpretation, or of the method of cauterization used. Results Follow-up colposcopy was satisfactory in 12 women in group A (24%) and 47 women in group B (92.2%) (P < .001). Forty-three women (86%) in group A and ten in group B (19.6%) had the squamocolumnar junction partly or fully located within the cervical canal (P < .001). Microcolpohysteroscopy located the squamocolumnar junction at a mean depth of 4.5 ± 2.4 mm (± standard deviation [SD]) in the women in group A and 1 ± 0.9 mm in group B (P < .001). Microcolpohysteroscopy could not be performed in 13 women in group A (26%) and one woman in group B (2%) (P < .001). Conclusion Diathermy ball cauterization at the new cervical os after loop excision results in a shift of the squamocolumnar junction toward the endocervical canal, and predisposes to cervical stenosis, thereby decreasing satisfactory colposcopy rates One hundred forty patients who underwent laser conization and 220 patients who underwent laser miniconizations were prospectively randomized into two study groups.
One treatment group was given antifibrinolytic therapy in the form of tranexamic acid (Cyklokapron, KabiVitrum, Sweden) intraoperatively and for 14 days postoperatively. The other group did not receive antifibrinolytic therapy. In the group of 68 patients with laser conizations who were given antifibrinolytic therapy, no postoperative hemorrhages occurred, whereas there were eight such hemorrhages in 72 conizations (11%) in the untreated patients. This difference is statistically significant (P = .004, Fisher exact test for two proportions). Also, for laser miniconization, the frequency of postoperative hemorrhage was almost halved, from 9.1% in 110 patients not receiving antifibrinolytic therapy to 5.5% in the 110 treated patients. The use of lateral cervical sutures did not reduce the frequency of postoperative hemorrhage at laser conization in the present study Two methods of obtaining hemostasis after cold knife cone biopsy were compared in a prospective randomized trial involving 200 patients. One method relied primarily on hemostatic sutures, and the other involved the use of a styptic solution (Monsel's solution) and vaginal pack, thus avoiding the use of sutures altogether. The short- and long-term morbidity in these two groups were compared and 12-month follow-up was completed. The use of sutures did not reduce the incidence of primary hemorrhage. Secondary hemorrhage was twice as frequent in the suture group, although this trend did not quite reach statistical significance. During long-term follow-up, significantly more patients in the suture group developed menstrual symptoms, cervical stenosis, and unsatisfactory colposcopy, requiring further operative intervention as a result Since its initial description by Prendiville et al. (1989), loop diathermy excision of the transformation zone has become a common method for the local treatment of colposcopically detected abnormalities of the transformation zone.
A number of studies, including Luesley et al. (1990, 1992), have analysed patient symptoms retrospectively and noted that these were in the main related to vaginal discharge and bleeding of short duration. We report here a prospective study objectively documenting patient bleeding and discharge following loop diathermy. Monsel's solution has been shown to be an effective haemostat at cone biopsy (Gilbert et al. 1989); we determined to see if Monsel's solution applied following loop diathermy would reduce vaginal discharge A double-blind randomized trial was made to ascertain the effect of tranexamic acid (AMCA) (Cyklokapron®) on the postoperative blood loss after conization. The case material consisted of 50 women referred to the clinic because of dysplasia or non-invasive cancer of the cervix. Five patients were excluded for various reasons. The treatment started in the evening of the day of operation and was continued for another 12 days, the dose being three tablets every 8 hours, corresponding to 4.5 g of tranexamic acid daily when the active drug was given. During the first 7 postoperative days, when the patients were in hospital, the blood losses were determined quantitatively. Prophylactic treatment with tranexamic acid reduced the postoperative blood loss as compared with the placebo group, the blood losses being 23 ± 3.2 ml and 79 ± 20.4 ml respectively. Sudden profuse bleeding postoperatively, requiring remedial measures, occurred in 7 patients, all in the placebo group.
With the exception of 1 patient in the placebo group, who complained of nausea, no side effects were recorded Objective To evaluate whether routinely giving an antibiotic after loop diathermy excision of the cervical transformation zone reduced post-operative vaginal loss OBJECTIVE To estimate the perioperative or postoperative bleeding rates after treatment of cervical intraepithelial neoplasia by loop electrosurgical excision procedure in either the follicular or luteal phase of the menstrual cycle. METHODS A randomized controlled trial was carried out to compare the outcomes in terms of primary and secondary hemorrhage between patients treated by loop electrosurgical excision procedure during either the follicular (30 women) or luteal phase (30 women) of the menstrual cycle. The two groups did not differ in terms of mean age, grade of cervical intraepithelial neoplasia, depth of excision, parity, and duration of menses. Primary outcome measures included the objective and subjective assessment of intra-operative and postoperative bleeding. RESULTS Women treated during the luteal phase of the menstrual cycle experienced significantly more postoperative bleeding than women treated during the follicular phase, as assessed by the fall in hematocrit levels (P < .001) and subjective reports. Intraoperative bleeding was judged to be more severe in women treated during the luteal phase of the cycle by a single, blinded colposcopist (P = .02). These women also experienced higher levels of anxiety postoperatively, which resulted in more consultations with medical staff (P = .007). CONCLUSION The use of loop electrosurgical excision procedure to treat cervical intraepithelial neoplasia results in less bleeding if performed during the follicular phase of the menstrual cycle Abstract.
To evaluate the inhibitory effect of tranexamic acid (AMCA) on increased fibrinolytic activity in connection with conisatio colli uteri, we have carried out a randomized, double-blind study with patients operated upon partly with an open method of operation (80 patients) and partly with modified Sturmdorff sutures (150 patients). In connection with the open method, the frequency of late bleeding decreased from 17.5% to 2.5%, which is significant. The corresponding decrease in connection with the other method was from 10.7% to 4.1%. This material was not sufficiently large to verify significance. The side effects of the prophylactic treatment with AMCA were few
11,023
27,820,505
Adverse events are a persistent and important reason for admission to the ICU. However, there is relatively weak evidence to estimate an overall incidence and preventability rate of these events. IMPLICATIONS FOR PRACTICE Unplanned intensive care admission within 24 hours of a procedure with an anesthetist in attendance (UIA) is a recommended clinical indicator in surgical patients.
BACKGROUND Adverse events are unintended patient injuries or complications that arise from healthcare management, resulting in death, disability or prolonged hospital stay. Adverse events that require critical care are a considerable financial burden to the healthcare system. Medical record review seems to be a reliable method for detecting adverse events. OBJECTIVES To synthesize the best available evidence regarding the estimates of the incidence and preventability of adverse events that necessitate intensive care admission; to determine the type and consequences (patient harm, mortality, length of ICU stay and direct medical costs) of these adverse events.
Context The Institute of Medicine report that caused concern about adverse events in hospitals relied on studies that used medical record review to identify adverse events. Many people question whether medical record reviews can accurately identify negligent adverse events. Contribution The following changes in the review process markedly reduced the rates of negligent adverse events: 1) increasing the number of reviewers from one to three and 2) requiring reviewers to be highly confident that an event was due to negligence. Implications Because review criteria can affect adverse event rates, the estimates cited by the Institute of Medicine could be inaccurate. The Editors Implicit judgments, based on medical record review, about adverse events caused by medical care have moderate to poor inter-rater reliability (1-7). However, the influential report on medical errors issued by the U.S. Institute of Medicine (IOM) in 2000 (8) relied on studies based on medical record review to estimate that 44 000 (9) to 98 000 (10) Americans die each year because of medical errors. These studies used one (9) or up to three (10) physician reviewers per record to detect adverse events. The IOM estimates are important because they have prompted health care providers and administrators to reduce errors and have influenced the U.S. research budget. We report details on the reliability of physician judgments about adverse events and negligent adverse events in the study from which the IOM derived its estimate of 44 000 error-related deaths per year (9). We also report the effects of varying criteria for reviewer confidence in and reviewer agreement about the presence of adverse events. Methods Data Sources This study was conducted as part of the Utah and Colorado Medical Practice Study (UCMPS) (5). In UCMPS, trained nurses reviewed medical records to identify 1 of 19 screening criteria that could indicate the presence of an adverse event.
A trained physician then reviewed flagged records by using a structured chart abstraction form. Previously, the nurse review process was found to be a good screening tool, with a sensitivity of 84% (1). We began the current study with all 2868 records referred for physician review by nurse screeners from the original 15 000-record sample (5000 from Utah and 10 000 from Colorado) (Appendix Figure). Two physician investigators confirmed all adverse events detected during this initial review. As a result of this confirmation process, we eliminated 13 false-positive cases. We eliminated cases only if they failed to meet explicit criteria for an adverse event, not if there was a concern about the reviewer's judgment regarding whether medical management caused the adverse event. Next, we randomly selected 500 of the 2868 records referred for physician review by nurse screeners (167 from Utah and 333 from Colorado), maintaining the original 1 to 2 ratio of Utah to Colorado records (Appendix Figure). The general characteristics of this sample are described elsewhere (5). Of these 500 records, 400 were randomly sampled from referred records that a single physician reviewer had previously judged to show no adverse events. Fifty records were randomly sampled from records initially found to show nonnegligent adverse events, and the remaining 50 were sampled from records initially judged to show negligent adverse events. This mix provided a sample similar to the one originally reviewed by the physicians. Record Review We completed three independent physician reviews of the original 500 medical records (not photocopies). The first was the original review for the UCMPS. The two subsequent physician reviews were conducted by physicians from the original UCMPS and by newly recruited physician reviewers who were trained in the same manner as the original UCMPS physician reviewers (5).
Physicians could participate in more than one review; however, all reviewers were blinded to the purpose of these additional reviews, and none of them reviewed a record that they had previously reviewed. All physician reviewers used the same data form, which included the same definition of adverse event: an injury caused by medical management (rather than the disease process) that resulted in prolonged hospital stay or disability at discharge. Negligence was defined as care that fell below the standard expected of physicians in the community. Because judgments about adverse events may be complex, reviewers used a six-point confidence scale that has been used in previous studies (5, 6). We required a confidence score of at least four (> 50% chance that medical management caused the adverse event) to indicate the presence of adverse events or negligent adverse events. For each of the three reviews, two investigators independently reviewed all detected adverse events to confirm that they met the study criteria. Statistical Analysis We calculated the rates of adverse events and negligent adverse events detected during each review. We report κ statistics for adverse events and negligent adverse events as the measure of inter-rater reliability. For all statistical analyses, we used SAS software, release 6.12 (SAS Institute, Inc., Cary, North Carolina). Role of the Funding Source The funding source contributed to the design of the study but had no role in conducting the study or in reporting its results. Results The first independent review of the study sample detected 64 adverse events; the second, 95; and the third, 86. Table 1 shows the agreement and disagreement among the three reviews. Comparisons between sets of reviews (review 1 compared with review 2, review 1 compared with review 3, and review 2 compared with review 3) demonstrated similar reliability.
The κ statistics ranged from 0.40 to 0.41 (the lowest bound of the three 95% CIs was 0.30, and the highest was 0.51) for adverse events and from 0.19 to 0.24 (the lowest bound of the 95% CIs was 0.05, and the highest was 0.37) for negligent adverse events. The former is considered moderate; the latter is considered poor (11). We had hypothesized that reliability would increase as the reviewer confidence score required to indicate an adverse event decreased. However, when adverse events were defined as having a confidence score of two or greater instead of four or greater, the overall κ statistic for adverse events decreased slightly, from 0.41 to 0.37 (P = 0.19). Table 1. Number of Cases Resulting in Agreement and Disagreement among Three Sets of Reviews Different review strategies produced different estimates of the total number of adverse events and negligent adverse events. Increasing the confidence score required to indicate the presence of an adverse event and increasing the degree of consensus between independent reviewers (agreement of one, two, or three reviewers) substantially affected adverse event rates in this sample of 500 records (Table 2). For example, if all three reviewers had a confidence score of four or greater, the adverse event rate would be 7.58% and the negligent adverse event rate would be 0.82%. If we required a score of four or greater on only one of three reviews to confirm the presence of an adverse event, the adverse event rate would be 37.68% and the negligent adverse event rate would be 15.13%. When the required confidence score was decreased to two, estimates among the three reviewers varied even more. Table 2. Adverse Event Rates Using Different Thresholds Discussion We found moderate to poor inter-rater reliability among physicians trying to identify adverse events and negligent adverse events by medical record review.
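The κ statistic reported above can be computed from a 2×2 agreement table for a pair of reviewers. The cell counts below are hypothetical (the study's actual Table 1 is not reproduced here), but their margins are chosen to match the 64 and 95 adverse events detected by reviews 1 and 2:

```python
# Minimal sketch of Cohen's kappa for two reviewers judging the same
# 500 records as "adverse event" / "no adverse event".
# Cell counts are hypothetical, not the study's actual Table 1.

def cohens_kappa(both_yes, r1_only, r2_only, both_no):
    n = both_yes + r1_only + r2_only + both_no
    p_observed = (both_yes + both_no) / n
    # Chance agreement from each reviewer's marginal "yes"/"no" rates
    r1_yes = (both_yes + r1_only) / n
    r2_yes = (both_yes + r2_only) / n
    p_chance = r1_yes * r2_yes + (1 - r1_yes) * (1 - r2_yes)
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical table: 40 records flagged by both reviewers, 24 by
# reviewer 1 only, 55 by reviewer 2 only, 381 by neither
# (margins: 64 events for review 1, 95 for review 2, as reported).
kappa = cohens_kappa(40, 24, 55, 381)
print(round(kappa, 2))  # → 0.41, "moderate" agreement
```

With these counts raw agreement is 84%, yet κ is only about 0.41, which is why the abstract can describe agreement as merely moderate despite reviewers concurring on most records.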
Increasing the required level of reviewer confidence to indicate the presence of an event or increasing the number of physician reviewers who detected an event resulted in markedly different event rates. Similar studies of medical record review have found almost identical reliability (κ statistics of approximately 0.4 for both studies) (6, 7). Our study is limited because our data come from hospital records in two states and may not be generalizable to other geographic locations. Furthermore, we could not measure the validity of chart review because there is no true gold standard that avoids some type of implicit assessment. Despite the poor to moderate reliability of medical record review, each of the adverse events identified represents a potential opportunity for quality improvement (12, 13). Each case involving an adverse event may contain valuable information for improving patient safety. However, researchers should use caution when estimating incidence and prevalence of errors solely on the basis of these data. For example, the IOM used data from our larger study (9), which used medical record review, to estimate that 44 000 Americans die each year of preventable adverse events (or, to use the IOM term, medical errors) (8). Our results suggest that these estimates are sensitive to the reviewer consensus and confidence required to indicate the presence of adverse events. The estimate of 44 000 deaths could be approximately 50% lower if the study used to estimate that figure had required independent agreement by two physician reviewers, and could be 30% higher if the required reviewer confidence about the presence of an adverse event was lower (Table 2). We cannot quantify the reduction more precisely because the IOM figures are based on judgments about the number of preventable adverse events and our results are based on judgments about adverse events and negligent adverse events.
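The sensitivity of the 44 000-death estimate described above is simple arithmetic; the sketch below just applies the approximate ±50%/+30% adjustments the authors attribute to stricter and looser review thresholds:

```python
# Back-of-the-envelope check of how the IOM's 44,000-death estimate
# shifts under the review thresholds discussed above (illustrative only).

iom_estimate = 44_000  # IOM estimate of annual U.S. deaths from preventable adverse events

# Requiring agreement of two independent physician reviewers roughly
# halves the detected event rate; lowering the required confidence
# score raises it by about 30% (per the study's Table 2).
stricter = iom_estimate * (1 - 0.50)  # two-reviewer agreement required
looser = iom_estimate * (1 + 0.30)    # lower confidence threshold

print(f"{stricter:,.0f} to {looser:,.0f}")  # → 22,000 to 57,200
```

The point of the exercise is the width of that band: the same chart-review data support anything from roughly 22 000 to 57 000 deaths depending on how reviewer consensus and confidence are handled.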
The figures for error-related deaths reported by the IOM are imprecise. However, it is shortsighted to focus on the exact number of deaths and thereby ignore the vast additional research on errors and adverse events cited by the IOM (8, 14). Regardless of whether the number of annual U.S. deaths due to medical error is 30 000 or 300 000, the need to test interventions to reduce errors and adverse events is clear. Our findings suggest that persons and institutions

Patient injuries are thought to have a substantial financial impact on the health care system, but recent studies have been limited to estimating the costs of adverse drug events in teaching hospitals. This analysis estimated the costs of all types of patient injuries from a representative sample of hospitals in Utah and Colorado. We detected 459 adverse events (of which 265 were preventable) by reviewing the medical records of 14,732 randomly selected 1992 discharges from 28 hospitals. The total costs (all results are in discounted 1996 dollars) were $661,889,000 for adverse events and $308,382,000 for preventable adverse events. Health care costs totaled $348,081,000 for all adverse events and $159,245,000 for the preventable adverse events. Fifty-seven percent of the adverse event health care costs, and 46% of the preventable adverse event costs, were attributed to outpatient medical care. Surgical complications, adverse drug events, and delayed or incorrect diagnoses and therapies were the most expensive types of adverse events. The costs of adverse events were similar to the national costs of caring for people with HIV/AIDS, and totaled 4.8% of per capita health care expenditures in these states

Background: An unplanned admission to the intensive care unit within 24 h of a procedure (UIA) is a recommended clinical indicator in surgical patients. Often regarded as a surrogate marker of adverse events, it has potential as a direct measure of patient safety.
Its true validity for such use is currently unknown. Methods: The authors validated UIA as an indicator of safety in surgical patients in a prospective cohort study of 44,130 patients admitted to their hospital. They assessed the association of UIA with intraoperative incidents and near misses, increased hospital length of stay, and 30-day mortality as three constructs of patient safety. Results: The authors identified 201 patients with a UIA; 104 (52.2%) had at least one incident or near miss. After adjusting for confounders, these incidents were significantly associated with UIA in all categories of surgical procedures analyzed; odds ratios were 12.21 (95% confidence interval [CI], 6.33–23.58), 4.06 (95% CI, 2.74–6.03), and 2.13 (95% CI, 1.02–4.42), respectively. The 30-day mortality for patients with UIA was 10.9%, compared with 1.1% in non-UIA patients. After risk adjustment, UIA was associated with excess mortality in several types of surgical procedures (odds ratio, 3.89; 95% CI, 2.14–7.04). The median length of stay was increased if UIA occurred: 16 days (interquartile range, 10–31) versus 2 days (interquartile range, 0.5–9) (P < 0.001). For patients with a UIA, the likelihood of discharge from hospital was significantly decreased in most surgical categories analyzed, with adjusted hazard ratios of 0.41 (95% CI, 0.23–0.77) to 0.58 (95% CI, 0.37–0.93). Conclusions: These findings provide strong support for the construct validity of UIA as a measure of patient safety

AIM To identify the effect of an ICU Liaison Nurse (LN) on major adverse events in patients recently discharged from the ICU. METHODS Case-control study using a chart audit protocol to assess controls retrospectively and cases prospectively. Controls did not receive ICU-based follow-up care. Cases received at least three visits over 3 days from the ICU LN. The LN service operated 7 days/week, 0800–1800.
Data on a range of predictors and three major adverse events (unexpected death, surgical procedure needed, and transfer to a higher level of care) were collected using a purpose-built audit form. RESULTS A total of 388 patients (201 controls and 187 cases) were included in the study. Demographic and clinical characteristics were similar for both groups. A total of 165 major adverse events were identified in 129 patients. After controlling for all other potential predictors, patients who received the LN intervention were 1.82 times more likely to be transferred to a higher level of care (P=0.028) and 2.11 times more likely to require a surgical procedure (P=0.006). Surgical patients were 7.20 times as likely to require a surgical procedure (P<0.001). CONCLUSIONS Our results support the claim that the ICU LN has a role in preventing adverse events. However, as the control data were retrospective and the study was conducted at one site, other unknown factors may have influenced the results

Background: Research into adverse events (AEs) has highlighted the need to improve patient safety. AEs are unintended injuries or complications resulting in death, disability or prolonged hospital stay that arise from health care management. We estimated the incidence of AEs among patients in Canadian acute care hospitals. Methods: We randomly selected 1 teaching, 1 large community and 2 small community hospitals in each of 5 provinces (British Columbia, Alberta, Ontario, Quebec and Nova Scotia) and reviewed a random sample of charts for nonpsychiatric, nonobstetric adult patients in each hospital for the fiscal year 2000. Trained reviewers screened all eligible charts, and physicians reviewed the positively screened charts to identify AEs and determine their preventability. Results: At least 1 screening criterion was identified in 1527 (40.8%) of 3745 charts. The physician reviewers identified AEs in 255 of the charts.
After adjustment for the sampling strategy, the AE rate was 7.5 per 100 hospital admissions (95% confidence interval [CI] 5.7–9.3). Among the patients with AEs, events judged to be preventable occurred in 36.9% (95% CI 32.0%–41.8%) and death in 20.8% (95% CI 7.8%–33.8%). Physician reviewers estimated that 1521 additional hospital days were associated with AEs. Although men and women experienced equal rates of AEs, patients who had AEs were significantly older than those who did not (mean age [and standard deviation] 64.9 [16.7] v. 62.0 [18.4] years; p = 0.016). Interpretation: The overall incidence rate of AEs of 7.5% in our study suggests that, of the almost 2.5 million annual hospital admissions in Canada similar to the type studied, about 185 000 are associated with an AE and close to 70 000 of these are potentially preventable

Objective: This study determined the incidence, type, nature, preventability and impact of adverse events (AEs) among hospitalised patients and potentially preventable deaths in Dutch hospitals. Methods: Using a three-stage retrospective record review process, trained nurses and doctors reviewed 7926 admissions: 3983 admissions of deceased hospital patients and 3943 admissions of discharged patients in 2004, in a random sample of 21 hospitals in the Netherlands (4 university, 6 tertiary teaching and 11 general hospitals). A large sample of deceased patients was included to determine the occurrence of potentially preventable deaths in hospitals more precisely. Results: One or more AEs were found in 5.7% (95% CI 5.1% to 6.4%) of all admissions and a preventable AE in 2.3% (95% CI 1.9% to 2.7%). Of all AEs, 12.8% resulted in permanent disability or contributed to death. The proportion of AEs and their impact increased with age. More than 50% of the AEs were related to surgical procedures.
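The national extrapolation in the Canadian study above is easy to reproduce. The 2,500,000 admission base below is a rounded assumption (the abstract says "almost 2.5 million", which is why the authors quote "about 185 000" rather than the exact product):

```python
# Reproducing the Canadian extrapolation: a 7.5-per-100 AE rate applied
# to roughly 2.5 million comparable annual admissions, with 36.9% of
# AE patients judged to have had a preventable event.
# The 2,500,000 base is a rounded assumption, not the study's exact figure.

admissions = 2_500_000
ae_rate = 7.5 / 100
preventable_fraction = 0.369

admissions_with_ae = admissions * ae_rate                # ~187,500 ("about 185 000")
preventable = admissions_with_ae * preventable_fraction  # ~69,200 ("close to 70 000")

print(f"{admissions_with_ae:,.0f} admissions with an AE")
print(f"{preventable:,.0f} potentially preventable")
```

Both products land within rounding distance of the abstract's quoted 185 000 and 70 000 figures.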
Among deceased hospital patients, 10.7% (95% CI 9.8% to 11.7%) had experienced an AE. Preventable AEs that contributed to death occurred in 4.1% (95% CI 3.5% to 4.8%) of all hospital deaths. Extrapolating to a national level, between 1482 and 2032 potentially preventable deaths occurred in Dutch hospitals in 2004. Conclusions: The incidence of AEs, preventable AEs and potentially preventable deaths in the Netherlands is substantial and needs to be reduced. Patient safety efforts should focus on surgical procedures and older patients

Abstract Objectives To compare the effectiveness, reliability, and acceptability of estimating rates of adverse events and rates of preventable adverse events using three methods: cross sectional (data gathered in one day), prospective (data gathered during hospital stay), and retrospective (review of medical records). Design Independent assessment of three methods applied to one sample. Setting 37 wards in seven hospitals (three public, four private) in southwestern France. Participants 778 patients: medical (n = 278), surgical (n = 263), and obstetric (n = 237). Main outcome measures The main outcome measures were the proportion of cases (patients with at least one adverse event) identified by each method compared with a reference list of cases confirmed by ward staff and the proportion of preventable cases (patients with at least one preventable adverse event). Secondary outcome measures were inter-rater reliability of screening and identification, perceived workload, and face validity of results. Results The prospective and retrospective methods identified similar numbers of medical and surgical cases (70% and 66% of the total, respectively), but the prospective method identified more preventable cases (64% and 40%, respectively), had good reliability for identification (κ = 0.83), represented an acceptable workload, and had higher face validity.
The cross sectional method showed a large number of false positives and identified none of the most serious adverse events. None of the methods was appropriate for obstetrics. Conclusion The prospective method of data collection may be more appropriate for epidemiological studies that aim to convince clinical teams that their errors contribute significantly to adverse events, to study organisational and human factors, and to assess the impact of risk reduction programmes

Background To compare two approaches to the statistical analysis of the relationship between the baseline incidence of adverse events and the effect of medical emergency teams (METs). Methods Using data from a cluster randomized controlled trial (the MERIT study), we analysed the relationship between the baseline incidence of adverse events and its change from baseline to the MET activation phase using quadratic modelling techniques. We compared the findings with those obtained with conventional subgroup analysis. Results Using linear and quadratic modelling techniques, we found that each unit increase in the baseline incidence of adverse events in MET hospitals was associated with a 0.59 unit subsequent reduction in adverse events (95% CI: 0.33 to 0.86) after MET implementation and activation. This applied to cardiac arrests (0.74; 95% CI: 0.52 to 0.95), unplanned ICU admissions (0.56; 95% CI: 0.26 to 0.85) and unexpected deaths (0.68; 95% CI: 0.45 to 0.90). Control hospitals showed a similar reduction only for cardiac arrests (0.95; 95% CI: 0.56 to 1.32). Comparison using conventional subgroup analysis, on the other hand, detected no significant difference between MET and control hospitals. Conclusions Our study showed that, in the MERIT study, when there was dependence of treatment effect on baseline performance, an approach based on regression modelling helped illustrate the nature and magnitude of such dependence while subgroup analysis did not.
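The linear part of the MERIT-style analysis amounts to regressing each hospital's reduction in adverse events on its baseline rate. The sketch below uses entirely synthetic per-hospital rates (not MERIT data) just to show the shape of the calculation; in the actual study the fitted slope was 0.59:

```python
# Sketch of the regression approach: regress each hospital's reduction
# in adverse events (baseline minus MET-phase rate) on its baseline rate.
# All rates below are synthetic, illustrative values, not MERIT data.

baseline = [2.0, 4.5, 6.0, 8.5, 10.0, 12.5]   # events per 1000 admissions
met_phase = [1.5, 2.8, 3.9, 4.7, 5.6, 6.4]
reduction = [b - m for b, m in zip(baseline, met_phase)]

# Ordinary least-squares slope: cov(baseline, reduction) / var(baseline)
n = len(baseline)
mx = sum(baseline) / n
my = sum(reduction) / n
slope = sum((x - mx) * (y - my) for x, y in zip(baseline, reduction)) / \
        sum((x - mx) ** 2 for x in baseline)
print(round(slope, 2))  # → 0.53: higher-baseline hospitals improve more
```

A positive slope is exactly the conditional treatment effect the authors describe: the benefit of MET activation scales with how poorly a hospital performed at baseline, which a coarse MET-vs-control subgroup comparison averages away.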
The ability to assess the nature and magnitude of such dependence may have policy implications. Regression techniques may thus prove useful in analysing data when there is a conditional treatment effect

OBJECTIVES. Currently there are few practical methods to identify and measure harm to hospitalized children. Patients in NICUs are at high risk and warrant a detailed assessment of harm to guide patient safety efforts. The purpose of this work was to develop a NICU-focused tool for adverse event detection and to describe the incidence of adverse events in NICUs identified by this tool. METHODS. A NICU-focused trigger tool for adverse event detection was developed and tested. Fifty patients from each site with a minimum 2-day NICU stay were randomly selected. All adverse events identified using the trigger tool were evaluated for severity, preventability, ability to mitigate, ability to identify the event earlier, and presence of an associated occurrence report. Each trigger, and the entire tool, was evaluated for positive predictive value. Study chart reviewers, in aggregate, identified 88.0% of all potential triggers and 92.4% of all potential adverse events. RESULTS. Review of 749 randomly selected charts from 15 NICUs revealed 2218 triggers, or 2.96 per patient, and 554 unique adverse events, or 0.74 per patient. The positive predictive value of the trigger tool was 0.38. Adverse event rates were higher for patients <28 weeks' gestation and <1500 g birth weight. Fifty-six percent of all adverse events were deemed preventable; 16% could have been identified earlier, and 6% could have been mitigated more effectively. Only 8% of adverse events were identified in existing hospital-based occurrence reports. The most common adverse events identified were nosocomial infections, catheter infiltrates, and abnormal cranial imaging. CONCLUSIONS.
Adverse event rates in the NICU setting are substantially higher than previously described. Many adverse events resulted in permanent harm and the majority were classified as preventable. Only 8% were identified using traditional voluntary reporting methods. Our NICU-focused trigger tool appears efficient and effective at identifying adverse events

BACKGROUND The ongoing debate on the incidence and types of iatrogenic injuries in American hospitals has been informed primarily by the Harvard Medical Practice Study, which analyzed hospitalizations in New York in 1984. The generalizability of these findings is unknown and has been questioned by other studies. OBJECTIVE We used methods similar to the Harvard Medical Practice Study to estimate the incidence and types of adverse events and negligent adverse events in Utah and Colorado in 1992. DESIGN AND SUBJECTS We selected a representative sample of hospitals from Utah and Colorado and then randomly sampled 15,000 nonpsychiatric 1992 discharges. Each record was screened by a trained nurse-reviewer for 1 of 18 criteria associated with adverse events. If ≥1 criteria were present, the record was reviewed by a trained physician to determine whether an adverse event or negligent adverse event occurred and to classify the type of adverse event. MEASURES The measures were adverse events and negligent adverse events. RESULTS Adverse events occurred in 2.9±0.2% (mean±SD) of hospitalizations in each state. In Utah, 32.6±4% of adverse events were due to negligence; in Colorado, 27.4±2.4%. Death occurred in 6.6±1.2% of adverse events and 8.8±2.5% of negligent adverse events. Operative adverse events comprised 44.9% of all adverse events; 16.9% were negligent, and 16.6% resulted in permanent disability. Adverse drug events were the leading cause of nonoperative adverse events (19.3% of all adverse events; 35.1% were negligent, and 9.7% caused permanent disability).
Most adverse events were attributed to surgeons (46.1%, 22.3% negligent) and internists (23.2%, 44.9% negligent). CONCLUSIONS The incidence and types of adverse events in Utah and Colorado in 1992 were similar to those in New York State in 1984. Iatrogenic injury continues to be a significant public health problem. Improving systems of surgical care and drug delivery could substantially reduce the burden of iatrogenic injury

An audit of 265 intensive care unit (ICU) admissions from the operating room was performed for the year 1991. In a quality assurance exercise, we identified 34 unanticipated ICU admissions (UIAs) by a retrospective peer review of the medical charts. Of these UIAs, 16 were deemed predictable and seven preventable. Five of the seven potentially preventable UIAs were judged to have had inappropriate intravenous fluid management. This has prompted changes in our education programme. In an assessment of our resource management, we evaluated prospectively collected data on the APACHE II scores on the day of admission, the incidence of ICU-specific interventions, length of stay in ICU, and outcomes. ICU-specific interventions were not initially required in 36% of admissions, and these patients had a low risk (1.1%) of eventually requiring ICU-specific interventions. In comparison with patients requiring ICU-specific interventions, they had lower APACHE II scores (10.2 vs 13.1), shorter ICU stays (medians of one vs two days), and lower ICU mortality (0 vs 8.2%), P < 0.05, but hospital mortality was not different (7.4 vs 15.3%). This audit has prompted re-organisation of our intensive care services, so that patients not requiring ICU-specific interventions will be managed in an intermediate care area with nurse:patient ratios of 1:3 or 4, in comparison with 1:1 or 2 ratios in the intensive care area.
The objective of the present study was to determine the rate of adverse patient occurrences in a medium-sized hospital and to assess the effectiveness and efficiency of limited adverse occurrence screening as a method of medical quality control.
The medical records of inpatients discharged from a base hospital in Horsham, Victoria, were screened by the medical records department using eight general outcome criteria. Histories found to meet a criterion were sent to a medical reviewer to determine if an adverse patient occurrence had taken place. A random sample of histories not meeting any criteria was also reviewed. The main outcome measures were the rate of adverse patient occurrences and the proportion of these events detected by limited occurrence screening, the accuracy of the screening process, the time taken, and the cost of finding adverse events. The total adverse patient occurrence rate was estimated to be 2.75% (95% CI 1.36–4.14%). Limited adverse occurrence screening using eight screening criteria detected 49.1% (95% CI 32.6–99.3%) of all adverse patient occurrences and 64.4% (95% CI 37.8–100%) of all adverse occurrences of major severity. This was achieved by reviewing the records of 9.72% of all patients discharged. Screening was quick and accurate (false positive rate 2.0%, false negative rate 0.4%). Medical review took on average 5 min (s.d. ±3.03). The method required 500 h of staff time over one year and cost $22,000 (0.1% of the total hospital budget). The proportion of adverse patient occurrences found by limited screening was much higher than that found by traditional quality assurance methods. Limited adverse occurrence screening using retrospective review requires a small proportion of the total budget that should be available to most hospitals for medical quality assurance activities
11,024
31,321,579
Moreover, the combined results showed a significant difference in the TAS, MDA, and CAT levels in patients with mild vs. moderate psoriasis and moderate vs. severe psoriasis. TAS and CAT levels in psoriasis patients were significantly lower than in healthy controls, whereas the TOS and MDA levels were significantly higher. Furthermore, the TAS, MDA, and CAT levels are associated with the severity of disease. These results indicate that redox imbalances play a major role in the pathogenesis of psoriasis
Although oxidative stress plays a major role in psoriasis, the association between oxidative stress biomarker levels and psoriasis in humans remains controversial.
OBJECTIVE There is growing evidence supporting a role for reactive oxygen species (ROS) in the pathogenesis of psoriasis. Propylthiouracil (PTU), an antithyroid drug, has been shown to have beneficial effects on psoriasis. The aim of this study was to investigate both disturbances in the oxidant/antioxidant system in psoriasis and whether PTU, shown to have immunomodulatory effects and antioxidant potential, has effects on the oxidant/antioxidant system and clinical improvement in psoriatics. DESIGN AND METHODS Malondialdehyde (MDA), an end product of lipid peroxidation, and the antioxidant enzymes superoxide dismutase (SOD) and glutathione peroxidase (GSH-Px) were measured in plasma, erythrocytes and skin biopsies of psoriatics who were resistant to conventional therapy, before and after 8 weeks of oral treatment with PTU (300 mg/day) or PTU/thyroxine (25 μg/day, to prevent possible hypothyroidism). The same parameters were also studied in healthy controls. Psoriasis Area and Severity Index (PASI) scores were used to evaluate the severity of the disease, and routine analyses and thyroid function tests were measured during the study. RESULTS Increased baseline MDA in all samples was found to be lower after treatment. In addition, baseline SOD and GSH-Px in skin and erythrocytes were also lower. The increased plasma SOD levels, and the SOD levels in skin and erythrocytes, of the study groups were found to be higher and lower, respectively, in all patients after the treatment. No tissue parameters or erythrocyte GSH-Px were different from control levels at the end of the study. Significant clinical improvement and decreased PASI scores were observed in all patients. Post-treatment TSH levels were higher in all patients, but these levels were within the reference range and none had clinical hypothyroidism. CONCLUSION These findings may provide some evidence for a potential role of increased lipid peroxidation and decreased antioxidant activity in psoriasis.
PTU may be considered as a treatment model in psoriasis, in particular for resistant cases, because of its antioxidant potential and its antiproliferative and immunomodulatory effects

BACKGROUND Psoriasis is a common inflammatory skin disease. Trace elements may play an active role in the pathogenesis of psoriasis. OBJECTIVE The aim of this study was to estimate the concentrations of selenium (Se), zinc (Zn), copper (Cu) and the Cu/Zn ratio, as well as total antioxidant status (TAS) and C-reactive protein (CRP), in the serum of patients with psoriasis. METHODS In this case-control study, sixty patients with psoriasis and fifty-eight healthy people were examined. Serum levels of Se, Zn and Cu were determined by atomic absorption spectrometry. The Cu/Zn ratio was calculated. TAS was measured spectrophotometrically. CRP was analyzed by an immunoturbidimetric method. Clinical activity of psoriasis was evaluated using the Psoriasis Area and Severity Index (PASI). RESULTS The serum concentration of Se in patients with psoriasis (71.89±16.90 μg/L) was lower compared to the control group (79.42±18.97 μg/L) and after NB-UVB. The Cu level of patients was higher (1.151±0.320 mg/L) compared to controls (1.038±0.336 mg/L), but the Zn level did not differ. We observed a higher Cu/Zn ratio (p<0.05) in the examined patients than in the control group and after NB-UVB. We found decreased TAS before and after NB-UVB. CRP levels were found to be within the normal range. A significant correlation coefficient between CRP and Cu/Zn was observed. CONCLUSIONS The study showed some disturbances in the serum levels of trace elements and TAS in psoriatic patients

Background Psoriasis is a chronic, inflammatory skin condition associated with a high frequency of cardiovascular events.
Modifications of plasma lipids and an increase in the levels of biochemical markers of inflammation and lipid peroxidation have been reported in subjects with psoriasis, suggesting a relationship between psoriasis, inflammation and oxidative damage

The aim of this research was to determine blood levels of vitamin E, beta carotene, lipid peroxidation (as malondialdehyde, MDA), reduced glutathione (GSH) and glutathione peroxidase (GSH-Px) activity in patients with psoriasis. Studies were carried out on 34 patients with moderate and severe psoriasis and healthy age-matched controls. Red blood cell (RBC) and plasma samples from healthy and patient subjects were taken. Levels of GSH and the activity of GSH-Px in both plasma and RBC samples were significantly (P<0.001) lower in patients with psoriasis than in controls, whereas beta carotene levels in plasma and MDA levels in RBC samples were significantly (P<0.01, P<0.001) higher in patients with psoriasis than in controls. However, vitamin E and MDA levels in plasma did not differ statistically. Although far from conclusive, these results provide some evidence for a potential role of increased lipid peroxidation and decreased antioxidants in psoriasis
11,025
31,736,854
Conclusions: A decrease in tinnitus distress scores with MBIs can be observed directly post-therapy, based on moderate to high quality studies. This was found regardless of the heterogeneity of patients, study design, type of MBI and outcome assessment. No effect of MBIs was observed for depression and anxiety in tinnitus patients. Long term effects remain uncertain.
Objectives: With this systematic review we aim to provide an overview of the evidence on the effect of Mindfulness Based Interventions (MBIs) on (1) tinnitus distress and (2) anxiety and/or depression in tinnitus patients.
Background: Tinnitus is experienced by up to 15% of the population and can lead to significant disability and distress. There is rarely a medical or surgical target, and psychological therapies are recommended. We investigated whether mindfulness-based cognitive therapy (MBCT) could offer an effective new therapy for tinnitus. Methods: This single-site randomized controlled trial compared MBCT to intensive relaxation training (RT) for chronic, distressing tinnitus in adults. Both treatments involved 8 weekly, 120-min sessions focused on either relaxation (RT) or mindfulness meditation (MBCT). Assessments were completed at baseline and at treatment commencement 8 weeks later. The primary outcomes were tinnitus severity (Tinnitus Questionnaire) and psychological distress (Clinical Outcomes in Routine Evaluation – Non-Risk, CORE-NR), 16 weeks after baseline. The analysis utilized a modified intention-to-treat approach. Results: A total of 75 patients were randomly allocated to MBCT (n = 39) or RT (n = 36). Both groups showed significant reductions in tinnitus severity and loudness, psychological distress, anxiety, depression, and disability. MBCT led to a significantly greater reduction in tinnitus severity than RT, with a mean difference of 6.3 (95% CI 1.3–11.4, p = 0.016). Effects persisted 6 months later, with a mean difference of 7.2 (95% CI 2.1–12.3, p = 0.006) and a standardized effect size of 0.56 (95% CI 0.16–0.96). Treatment was effective regardless of initial tinnitus severity, duration, or hearing loss. Conclusions: MBCT is effective in reducing tinnitus severity in chronic tinnitus patients compared to intensive RT. It also reduces psychological distress and disability.
Future studies should explore the generalizability of this approach and how outcome relates to different aspects of the intervention. The Tinnitus Research Consortium (TRC) issued a Request for Proposals in 2003 to develop a new tinnitus outcome measure that would: (1) be highly sensitive to treatment effects (validated for "responsiveness"); (2) address all major dimensions of tinnitus impact; and (3) be validated for scaling the negative impact of tinnitus. A grant was received by M. Meikle to conduct the study. In that observational study, all of the TRC objectives were met, with the final 25-item Tinnitus Functional Index (TFI) containing eight subscales. The study was published in 2012, and since then the TFI has received increasing international use and is being translated into at least 14 languages. The present study utilized data from a randomized controlled trial (RCT) that involved testing the efficacy of "telephone tinnitus education" as intervention for bothersome tinnitus. These data were used to confirm results from the original TFI study. Overall, the TFI performed well in the RCT, with Cohen's d being 1.23. There were large differences between the eight subscales, ranging from a mean 13.2-point reduction (for the Auditory subscale) to a mean 26.7-point reduction (for the Relaxation subscale). Comparison of TFI performance was made with the Tinnitus Handicap Inventory. All of the results confirmed sensitivity of the TFI along with its subscales. This article is part of a Special Issue. Objectives: Research suggests that an 8-week Mindfulness-Based Stress Reduction (MBSR) program (a structured form of meditation) might be effective in the treatment of various health problems including chronic pain.
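Effect sizes like the TFI study's Cohen's d = 1.23 are a standardized mean difference: the mean difference divided by a pooled standard deviation. A hedged sketch; the means, SDs, and group sizes below are invented for illustration, since the abstract reports only the resulting d:

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2)
                          / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Invented numbers: a 12-point drop against a pooled SD of 10 gives d = 1.2,
# roughly the magnitude the TFI study reports (d = 1.23).
d = cohens_d(40.0, 10.0, 50, 28.0, 10.0, 50)
```

By convention d around 0.8 is a "large" effect, so values above 1 such as the TFI's indicate a measure highly responsive to treatment change.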
Our objective was to compare the clinical effectiveness of the MBSR program with a multidisciplinary pain intervention (MPI) program in terms of pain intensity, pain-related distress, quality of life, and mood in patients with chronic pain. Methods: A randomized, comparative clinical trial was conducted, including 6-month posttreatment follow-up. Ninety-nine participants, aged 24 to 64 years, with pain for a minimum of 3 months, were recruited from community-based clinics, hospitals, and community service centers. Participants were randomly allocated to either the MBSR program (51 participants) or a MPI program (48 participants). The study used validated Chinese versions of self-reported questionnaires measuring pain, mood symptoms, and health-related quality of life. Results: Thirty-nine participants (77%) completed the MBSR program and 44 (90%) completed the MPI program. Patients in both groups were comparable with regard to demographic characteristics, pain intensity, mood symptoms, and health-related quality-of-life measures before intervention. In both groups, patients who completed the trial demonstrated statistically significant improvements in pain intensity and pain-related distress. However, no statistically significant differences were observed in overall results between the MBSR and MPI groups. Conclusions: This randomized clinical trial showed that both MBSR and MPI programs reduced pain intensity and pain-related distress, although no statistically significant differences were observed between the 2 groups and the improvements were small. Background: Tinnitus, the perception of sound in absence of an external acoustic source, impairs the quality of life in 2% of the population. Since in most cases causal treatment is not possible, the majority of therapeutic attempts aim at developing and strengthening individual coping and habituation strategies.
Therapeutic interventions that incorporate training in mindfulness meditation have become increasingly popular in the treatment of stress-related disorders. Here we conducted a randomized, controlled clinical study to investigate the efficacy of a specific mindfulness- and body-psychotherapy-based program in patients suffering from chronic tinnitus. Methods: Thirty-six patients were enrolled in this pilot study. The treatment was specifically developed for tinnitus patients and is based on mindfulness and body psychotherapy. Treatment was performed as group therapy at two training weekends that were separated by an interval of 7 weeks (eleven hours/weekend) and in four further two-hour sessions (weeks 2, 9, 18 and 22). Patients were randomized to receive treatment either immediately or after a waiting time, which served as a control condition. The primary study outcome was the change in tinnitus complaints as measured by the German version of the Tinnitus Questionnaire (TQ). Results: ANOVA testing for the primary outcome showed a significant interaction effect of time by group (F = 7.4; df = 1,33; p = 0.010). Post hoc t-tests indicated an amelioration of TQ scores from baseline to week 9 in both groups (intervention group: t = 6.2; df = 17; p < 0.001; control group: t = 2.5; df = 16; p = 0.023), but the intervention group improved more than the control group. Groups differed at weeks 7 and 9, but not at week 24, as far as the TQ score was concerned. Conclusions: Our results suggest that this mindfulness- and body-psychotherapy-based approach is feasible in the treatment of tinnitus and merits further evaluation in clinical studies with larger sample sizes. The study is registered with ClinicalTrials.gov (NCT01540357). UNLABELLED: We conducted a randomized clinical trial to examine the relative effectiveness of two psychological interventions for treating tinnitus.
People with tinnitus were initially offered a single session of psychoeducation about tinnitus, followed 2 months later by six weekly sessions of either mindfulness or relaxation training. Results indicated benefits from psychoeducation in reducing negative emotions, rumination and psychological difficulties of living with tinnitus. These effects were maintained or enhanced by mindfulness training that also emphasized acceptance, although they were eroded in the relaxation condition over the follow-up. Mediating processes are discussed, and suggestions for refining clinical interventions for this population are offered. KEY PRACTITIONER MESSAGE: The present results suggest that mindfulness training might constitute a useful addition to psychoeducation for interventions targeting the psychological consequences of tinnitus. BACKGROUND: Chronic tinnitus is a frequent symptom presentation in clinical practice. No drug treatment to date has shown itself to be effective. The aim of the present study was to investigate the effects of cognitive behavioural therapy and meditation in tinnitus sufferers. METHODOLOGY: Patients were selected from a dedicated tinnitus clinic in the Welsh Hearing Institute. A waiting-list control design was used. Twenty-five chronic tinnitus sufferers were consecutively allocated to two groups, one receiving a cognitive behavioural therapy/meditation intervention of four one-hour sessions, with the other group waiting three months and subsequently treated in the same way, thereby acting as their own control. The main outcome was measured using the Hallam tinnitus questionnaire. A four- to six-month follow-up was conducted. RESULTS: These showed statistically significant reductions in tinnitus variables in both the active and the control group. Post-therapy, no significant change was found after the waiting-list period. The improvement was maintained at the four- to six-month period.
CONCLUSION: The positive findings give support for the use of cognitive behavioural therapy/meditation for chronic tinnitus sufferers. Background: Because of specific methodological difficulties in conducting randomized trials, surgical research remains dependent predominantly on observational or non-randomized studies. Few validated instruments are available to determine the methodological quality of such studies, either from the reader's perspective or for the purpose of meta-analysis. The aim of the present study was to develop and validate such an instrument. The objectives of this pilot study were to assess the feasibility of recruitment and adherence to an eight-session mindfulness meditation program for community-dwelling older adults with chronic low back pain (CLBP) and to develop initial estimates of treatment effects. It was designed as a randomized, controlled clinical trial. Participants were 37 community-dwelling older adults aged 65 years and older with CLBP of moderate intensity occurring daily or almost every day. Participants were randomized to an 8-week mindfulness-based meditation program or to a wait-list control group. Baseline, 8-week and 3-month follow-up measures of pain, physical function, and quality of life were assessed. Eighty-nine older adults were screened and 37 found to be eligible and randomized within a 6-month period. The mean age of the sample was 74.9 years, 21/37 (57%) of participants were female and 33/37 (89%) were white. At the end of the intervention, 30/37 (81%) participants completed 8-week assessments. Average class attendance of the intervention arm was 6.7 out of 8. They meditated an average of 4.3 days a week, and the average minutes per day was 31.6.
Compared to the control group, the intervention group displayed significant improvement in the Chronic Pain Acceptance Questionnaire Total Score and Activities Engagement subscale (P = .008, P = .004) and SF-36 Physical Function (P = .03). An 8-week mindfulness-based meditation program is feasible for older adults with CLBP. The program may lead to improvement in pain acceptance and physical function. OBJECTIVE: Psychotherapeutic interventions have been adopted effectively in the management of tinnitus for a long time. This study compared mindfulness meditation and relaxation therapy for management of tinnitus. METHODS: In this randomised controlled trial, patients were recruited for five sessions of mindfulness meditation or five sessions of relaxation therapy. Patients' responses were evaluated using the Tinnitus Reaction Questionnaire as a primary outcome measure, and the Hospital Anxiety and Depression Scale, visual analogue scale and a health status indicator as secondary outcome measures. RESULTS: A total of 86 patients were recruited. Thirty-four patients completed mindfulness meditation and 27 patients completed relaxation therapy. Statistically significant improvement was seen in all outcome measures except the health status indicator in both treatment groups. The change in treatment scores was greater in the mindfulness meditation group than in the relaxation therapy group. CONCLUSION: This study suggests that although both mindfulness meditation and relaxation therapy are effective in the management of tinnitus, mindfulness meditation is superior to relaxation therapy. OBJECTIVES: Tinnitus is a very common experience, and although usually mild, in a significant proportion of people it is intrusive, persistent, and disabling. This paper explores the lived experience of chronic disabling tinnitus, with the aim of understanding how distress and chronicity occur, and what might help to reduce this.
DESIGN: Nine individuals were interviewed 6 months after completing mindfulness-based cognitive therapy (MBCT) as part of a randomized controlled trial. The results reported here focus on their experiences of tinnitus before receiving MBCT. METHODS: Data were collected through semi-structured, face-to-face interviews with a clinical psychologist, and an interpretative phenomenological analysis approach was used. RESULTS: Two superordinate themes emerged. 'Living with tinnitus' describes a range of significant and profound life changes that result from the condition. Tinnitus can be a life-altering condition affecting thoughts, emotions, attention, behaviour, and the social world. 'The health care journey' shows how chronic distress was intensified by unhelpful health communications and alleviated by helpful consultations. CONCLUSIONS: Tinnitus is a biopsychosocial condition, and associated distress is affected by cognitive, behavioural, attentional, and social factors. The individuals' initial reactions to tinnitus interact with the responses of others, including health care professionals. The burden of tinnitus could be reduced by developing early interventions that offer clear, helpful, and realistic information about tinnitus and appropriate treatments. Statement of contribution: What is already known on this subject? Tinnitus is the experience of an internal sound without an external sound source. It can be troubling, disabling, and chronic, and usually has no clear medical cause or medical treatment, but psychological interventions are promising. Cognitive, behavioural, and attentional factors play a role in distress and therapeutic outcome. Clinical encounters are improved by aligning patient and clinician and sharing decision-making. What does this study add? It is the first in-depth study exploring how tinnitus distress and health care systems interact.
It shows how a biopsychosocial approach to tinnitus may reduce tinnitus burden more effectively than a biomedical, diagnostic-focused approach. It indicates how effective early health care information could be used to reduce chronic tinnitus distress.
11,026
31,890,292
Conclusion: Systematic review and meta-analysis suggests that lateral wedge insoles cause an overall slight reduction in the biomechanical parameters. Higher wedge degrees do not show greater reductions than lower degrees.
Background: Lateral wedge insoles are traditionally used to reduce the adduction moment that crosses the knee during walking in people with medial knee osteoarthritis. However, the best wedge degree to reduce knee joint load is not yet well established.
Background: Supporting 21st-century health care and the practice of evidence-based medicine (EBM) requires ubiquitous access to clinical information and to knowledge-based resources to answer clinical questions. Many questions go unanswered, however, due to lack of skills in formulating questions, crafting effective search strategies, and accessing databases to identify best levels of evidence. Methods: This randomized trial was designed as a pilot study to measure the relevancy of search results using three different interfaces for the PubMed search system. Two of the search interfaces utilized a specific framework called PICO, which was designed to focus clinical questions and to prompt for publication type or type of question asked. The third interface was the standard PubMed interface readily available on the Web. Study subjects were recruited from interns and residents on an inpatient general medicine rotation at an academic medical center in the US. Thirty-one subjects were randomized to one of the three interfaces, given 3 clinical questions, and asked to search PubMed for a set of relevant articles that would provide an answer for each question. The success of the search results was determined by a precision score, which compared the number of relevant or gold-standard articles retrieved in a result set to the total number of articles retrieved in that set. Results: Participants using the PICO templates (Protocol A or Protocol B) had higher precision scores for each question than the participants who used Protocol C, the standard PubMed Web interface (Question 1: A = 35%, B = 28%, C = 20%; Question 2: A = 5%, B = 6%, C = 4%; Question 3: A = 1%, B = 0%, C = 0%). 95% confidence intervals were calculated for the precision for each question using a lower boundary of zero. However, the 95% confidence limits were overlapping, suggesting no statistical difference between the groups.
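The precision score used in the PICO search study is simply the fraction of retrieved articles that appear in the gold-standard set. A minimal sketch, with hypothetical article IDs (the study does not publish its result sets):

```python
def precision(retrieved, gold_standard):
    """Fraction of retrieved articles that are gold-standard relevant."""
    retrieved, gold_standard = set(retrieved), set(gold_standard)
    if not retrieved:
        return 0.0
    return len(retrieved & gold_standard) / len(retrieved)

# Hypothetical result set: 20 retrieved articles, 7 of them gold-standard,
# gives precision 0.35 -- the value reported for Question 1, Protocol A.
p = precision(range(101, 121), [102, 104, 106, 108, 110, 112, 114])
```

Precision alone ignores relevant articles that were never retrieved; that is why the study's very low scores on Questions 2 and 3 say little about recall.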
Conclusion: Due to the small number of searches for each arm, this pilot study could not demonstrate a statistically significant difference between the search protocols. However, there was a trend towards higher precision that needs to be investigated in a larger study to determine if PICO can improve the relevancy of search results. OBJECTIVE: To test the hypothesis that a custom-fit valgus knee brace and custom-made lateral wedge foot orthotic will have the greatest effects on decreasing the external knee adduction moment during gait when used concurrently. DESIGN: Proof-of-concept, single test session, crossover trial. SETTING: Biomechanics laboratory within a tertiary care center. PARTICIPANTS: Patients (n=16) with varus alignment and knee osteoarthritis (OA) primarily affecting the medial compartment of the tibiofemoral joint (varus gonarthrosis). INTERVENTIONS: Custom-fit valgus knee brace and custom-made full-length lateral wedge foot orthotic. Amounts of valgus angulation and wedge height were tailored to each patient to ensure comfort. MAIN OUTCOME MEASURES: The external knee adduction moment (% body weight [BW]*height [Ht]), frontal plane lever arm (cm), and ground reaction force (N/kg), determined from 3-dimensional gait analysis completed under 4 randomized conditions: (1) control (no knee brace, no foot orthotic), (2) knee brace, (3) foot orthotic, and (4) knee brace and foot orthotic. RESULTS: The reduction in knee adduction moment was greatest when concurrently using the knee brace and foot orthotic (effect sizes ranged from 0.3 to 0.4). The mean decrease in first peak knee adduction moment compared with control was .36% BW*Ht (95% confidence interval [CI], -.66 to -.07). This was accompanied by a mean decrease in frontal plane lever arm of .59 cm (95% CI, -.94 to -.25).
CONCLUSIONS: These findings suggest that using a custom-fit knee brace and custom-made foot orthotic concurrently can produce a greater overall reduction in the knee adduction moment, through combined effects in decreasing the frontal plane lever arm. We examined if a subject-specific amount of lateral wedge added to a foot orthosis could alter knee mechanics to potentially reduce the progression of knee osteoarthritis in patients with medial knee osteoarthritis. Twenty individuals with medial knee osteoarthritis (>=2 Kellgren-Lawrence grade) were prescribed a custom laterally wedged foot orthotic device. The prescribed wedge amount was the minimal wedge amount that provided the maximum amount of pain reduction during a lateral step-down test. Following an accommodation period, all subjects returned to the laboratory for a gait analysis. Knee mechanics were collected as the subjects walked at an intentional walking speed. Walking in the laterally wedged orthotic device significantly reduced the peak adduction moment during early stance (p < 0.01) compared to the nonwedged device. Similarly, the wedged orthotic device significantly reduced the knee adduction excursion from heel strike to peak adduction (p < 0.01) compared to the nonwedged device. No differences in the peak adduction moment during propulsion or peak adduction during stance were observed between the orthotic conditions. A subject-specific laterally wedged orthotic device was able to reduce the peak knee adduction moment during early stance, which is thought to be associated with the progression of knee osteoarthritis. Previous studies on this device have reported issues associated with foot discomfort when using wedge amounts > 7 degrees; however, no such issues were reported in this study.
Therefore, providing a custom laterally wedged orthotic device may potentially increase compliance while still potentially reducing disease progression. OBJECTIVE: To compare the clinical effects of laterally wedged insoles and neutrally wedged insoles (used as control) in patients with medial femoro-tibial knee osteoarthritis (OA). DESIGN: 6-month prospective randomized controlled study. PATIENTS: Outpatients with painful medial femoro-tibial knee OA. OUTCOME MEASURES: Patient's overall assessment of disease activity (5-grade scale), WOMAC index subscales and concomitant treatments. STATISTICAL ANALYSIS: Performed as an intention-to-treat analysis. Main criterion: improvement in the patient's assessment of activity (defined as a reduction of 1 grade or more at month 6 compared to baseline, and no intraarticular injection or lavage during the study). Secondary criteria for assessment: (a) improvement in the patient's assessment of activity at months 1 and 3 compared to baseline, (b) improvement in the WOMAC subscales at months 1, 3 and 6, compared to baseline (defined as an improvement of at least 30%, and no intraarticular injection or lavage during the study) and (c) concomitant therapies (analgesics and NSAIDs). RESULTS: The baseline characteristics of the 156 recruited patients (41 males, 115 females, mean age 64.8 years) were not different in the two treatment groups. At months 1, 3 and 6 the percentages of patients with improvement in assessment of disease activity, and in WOMAC pain, joint stiffness, and physical functioning subscales, were similar in the two groups. The number of days with NSAID intake during the previous 3 months was decreased at month 6 compared with baseline in the group furnished with laterally wedged insoles (14.1 days ± 28 vs 9.9 days ± 27, P=0.04, Wilcoxon paired test), while it remained unchanged in the other group (15.5 days ± 24 vs 15 ± 28, P=0.56).
Compliance and tolerance were satisfactory. Compliance was different between the two groups at month 6, with a greater frequency of patients who wore insoles permanently in the laterally wedged insole group than in the other group (87.8% vs 74.3%; P=0.032). CONCLUSION: This study failed to demonstrate a relevant short-term symptomatic effect of laterally wedged insoles in medial femoro-tibial OA. However, the decrease in NSAID consumption together with better compliance in the treated group are in favor of a beneficial effect of laterally wedged insoles in medial femoro-tibial OA. Objective: This study aimed to determine whether the effect of laterally wedged insoles on the adduction moment in knee osteoarthritis (OA) declined after one month of wear, and whether higher reported use of insoles was associated with a reduced effect on the adduction moment at one month. Methods: Twenty people with medial compartment OA underwent gait analysis in their own shoes wearing (i) no insoles and (ii) insoles wedged laterally 5°, in random order. Testing occurred at baseline and after one month of use of the insoles. Participants recorded daily use of insoles in a log-book. Outcomes were the first and second peak external knee adduction moment and the adduction angular impulse, compared across conditions and time with repeated-measures general linear models. Correlations were obtained between total insole use and change in gait parameters with used insoles at one month, and change scores were compared between high and low users of insoles using general linear models. Results: There was a significant main effect for condition, whereby insoles significantly reduced the adduction moment (all p < 0.001). However, there was no significant main effect for time, nor was an interaction effect evident.
No significant associations were observed between total insole use and change in gait parameters with used insoles at one month, nor was there a difference in effectiveness of insoles between high and low users of the insoles at this time. Conclusion: Effects of laterally wedged insoles on the adduction moment do not appear to decline after one month of continuous use, suggesting that significant wedge degradation does not occur over the short term. Flaws in the design, conduct, analysis, and reporting of randomised trials can cause the effect of an intervention to be underestimated or overestimated. The Cochrane Collaboration's tool for assessing risk of bias aims to make the process clearer and more accurate. Background: Pronated foot posture is associated with many clinical and biomechanical outcomes unique to medial compartment knee osteoarthritis (OA). Though shoe-worn insole treatment, including lateral wedges, is commonly studied in this patient population, their effects on the specific subgroup of people with medial knee OA and concomitant pronated feet are unknown. The purpose of this study was to evaluate whether lateral wedge insoles with custom arch support are more beneficial than lateral wedge insoles alone for knee and foot symptoms in people with medial tibiofemoral knee osteoarthritis (OA) and pronated feet. Methods: Twenty-six people with pronated feet and symptomatic medial knee OA participated in a randomized crossover study comparing five-degree lateral wedge foot insoles with and without custom foot arch support. Each intervention was worn for two months, separated by a two-month washout period of no insole wear. Main outcomes included the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) pain and physical function subscales, the revised short-form Foot Function Index (FFI-R) pain and stiffness subscales, and the timed stair climb test.
Regression modeling was conducted to examine treatment, period, and interaction effects. Results: Twenty-two participants completed the study, and no carryover or interaction effects were observed for any outcome. Significant treatment effects were observed for the timed stair climb, with greater improvements seen with the lateral wedges with arch support. Within-condition significant improvements were observed for WOMAC pain and physical function, as well as FFI-R pain and stiffness, with lateral wedges with arch support use. More adverse effects were reported with the lateral wedges alone, while more people preferred the lateral wedges with arch support overall. Conclusions: Addition of custom arch support to a standard lateral wedge insole may improve foot and knee symptoms in people with knee OA and concomitant pronated feet. These preliminary findings suggest further research evaluating the role of shoe-worn insoles for treatment of this specific subgroup of people with knee OA is warranted. Trial registration: ClinicalTrials.gov identifier NCT02234895. Background: The results of conservative treatment of knee osteoarthritis (OA) are generally evaluated in epidemiological studies with clinical outcome measures as primary outcomes. Biomechanical evaluation of orthoses shows that there are potentially beneficial biomechanical changes to joint loading; however, evaluation in relation to clinical outcome measures in longitudinal studies is needed. Questions/purposes: We asked (1) is there an immediate effect on gait in patients using a laterally wedged insole or valgus knee brace; (2) is there a late (6 weeks) effect; and (3) is there a difference between subgroups within each group with respect to patient compliance, body mass index, and OA status? Methods: This was a secondary analysis of data from a previous randomized controlled trial of patients with early medial knee OA.
A total of 91 patients were enrolled in that trial, and 73 (80%) completed it after 6 months. Of the enrolled patients, 80 (88%) met prespecified inclusion criteria for analysis in the present study. The patients were randomized to an insole or brace. Gait was analyzed with and without wearing the orthosis (insole or brace) at baseline and after 6 weeks. Measurements were taken of the knee adduction moment, ground reaction force, moment arm, walking speed, and toe-out angle. Data were analyzed with regression analyses based on an intention-to-treat principle. Results: A mean reduction of 4% (±10) (95% confidence interval [CI], −0.147 to −0.03, p = 0.003) of the peak knee adduction moment and 4% (±13) (95% CI, −0.009 to −0.001, p = 0.01) of the moment arm at baseline was observed in the insole group when walking with an insole was compared with walking without an insole. A mean reduction of 1% (±10) (95% CI, −0.002 to −0.001, p = 0.001) of the peak knee adduction moment and no reduction of the moment arm were measured after 6 weeks. No reduction of knee adduction moment, moment arm, or ground reaction force was seen in the brace group at baseline and after 6 weeks. Subgroup analysis showed no differences in biomechanical effect for obesity, stage of OA, and whether patients showed a clinical response to the treatment. Conclusions: Laterally wedged insoles unload the medial compartment only at baseline in patients with varus alignment, and by an amount that might not be clinically important. No biomechanical alteration was seen after 6 weeks of wearing the insole. Valgus brace therapy did not result in any biomechanical alteration. Taken together, this study does not show a clinically relevant biomechanical effect of insole and brace therapy in patients with varus medial knee OA. Level of Evidence: Level I, therapeutic study.
See Instructions for Authors for a complete description of levels of evidence. Wedged insoles are believed to be of clinical benefit to individuals with knee osteoarthritis by reducing the knee adduction moment (KAM) during gait. However, previous clinical trials have not specifically controlled for KAM reduction at baseline, thus it is unknown if reduced KAMs actually confer a clinical benefit. Forty-eight participants with medial knee osteoarthritis were randomly assigned to either a control group, where no footwear intervention was given, or a wedged insole group, where KAM reduction was confirmed at baseline. KAMs, Knee Injury and Osteoarthritis Outcome Score (KOOS) and Physical Activity Scale for the Elderly (PASE) scores were measured at baseline. KOOS and PASE surveys were re-administered at three months' follow-up. The wedged insole group did not experience a statistically significant or clinically meaningful change in KOOS pain over three months (p=0.173). Furthermore, there was no association between change in KAM magnitude and change in KOOS pain over three months within the wedged insole group (R2=0.02, p=0.595). Improvement in KOOS pain for the wedged insole group was associated with worse baseline pain, and a change in PASE score over the three-month study (R2=0.57, p=0.007). As an exploratory comparison, there was no significant difference in change in KOOS pain (p=0.49) between the insole and control group over three months. These results suggest that reduced KAMs do not appear to provide any clinical benefit compared to no intervention over a follow-up period of three months. ClinicalTrials.gov ID Number: NCT02067208. Objective: To assess the effect of lateral wedge insoles compared with flat control insoles on improving symptoms and slowing structural disease progression in medial knee osteoarthritis. Design: Randomised controlled trial. Setting: Community in Melbourne, Australia.
Participants: 200 people aged 50 or more with clinical and radiographic diagnosis of mild to moderately severe medial knee osteoarthritis. Interventions: Full-length 5-degree lateral wedged insoles or flat control insoles worn inside the shoes daily for 12 months. Main outcome measures: The primary symptomatic outcome was change in overall knee pain (past week) measured on an 11-point numerical rating scale. The primary structural outcome was change in volume of medial tibial cartilage from magnetic resonance imaging scans. Secondary clinical outcomes included changes in measures of pain, function, stiffness, and health-related quality of life. Secondary structural outcomes included progression of medial cartilage defects and bone marrow lesions. Results: Between-group differences did not differ significantly for the primary outcomes of change in overall pain (−0.3 points, 95% confidence interval −1.0 to 0.3) and change in medial tibial cartilage volume (−0.4 mm3, 95% confidence interval −15.4 to 14.6), and confidence intervals did not include minimal clinically important differences. None of the changes in secondary outcomes showed differences between groups. Conclusion: Lateral wedge insoles worn for 12 months provided no symptomatic or structural benefits compared with flat control insoles. Trial registration: Australian New Zealand Clinical Trials Registry ACTR12605000503628 and ClinicalTrials.gov NCT00415259. Kakihana W, Akai M, Nakazawa K, Naito K, Torii S: Inconsistent knee varus moment reduction caused by a lateral wedge in knee osteoarthritis. Am J Phys Med Rehabil 2007;86:446-454. Objective: To determine, with the assistance of a larger sample size, whether the inconsistency of reducing the knee-joint varus moment with a lateral wedge in patients with medial compartment knee osteoarthritis (OA) persists and, if so, what underlying mechanisms may explain it.
Design: Crossover design whereby 51 patients with bilateral isolated medial compartment knee OA and 19 age-matched healthy controls walked under two different wedge conditions: a 0-degree control wedge and a 6-degree lateral wedge. We conducted three-dimensional motion analysis, hip–knee–ankle (HKA) angle measurement, and radiologic assessment with the Kellgren–Lawrence grade. We investigated frontal plane angles and moments at the knee and subtalar joints, ground reaction forces, and center of pressure (CoP). Moments were derived using a three-dimensional inverse dynamics model of the lower extremity. Results: Nine patients (17.6%) had an increased knee-joint varus moment with the 6-degree lateral wedge via a medially shifted location of the CoP. These patients did not differ from the remaining patients in HKA angle or radiologic assessment. Conclusion: In approximately 18% of patients with bilateral medial compartment knee OA, the 6-degree lateral wedge seems to fail to reduce the knee-joint varus moment. The indication and limitations of the lateral wedge should be confirmed by a randomized controlled study. OBJECTIVE To assess the immediate effects of laterally wedged insoles on walking pain, external knee adduction moment, and static alignment, and whether these immediate effects, together with age, body mass index, and disease severity, predict clinical outcome after 3 months of wearing insoles in medial knee osteoarthritis. METHODS Forty volunteers (mean age 64.7 years, 16 men) were tested in random order with and without a pair of 5-degree full-length lateral wedges. Immediate changes in static alignment were measured via the radiographic mechanical axis and changes in adduction moment via 3-dimensional gait analysis.
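The external knee adduction moment in gait analyses like these comes from inverse dynamics. As a rough illustration only (not any of these studies' actual models), the following is a quasi-static sketch that ignores segment inertial terms; all coordinates, forces, and the axis convention are hypothetical:

```python
def frontal_plane_knee_moment(cop, knee_center, grf):
    """Quasi-static external moment at the knee from the ground reaction force.

    cop, knee_center: (x, y, z) lab coordinates in metres (x anterior, y lateral, z up).
    grf: ground reaction force vector in newtons.
    Ignoring segment inertial terms makes this only a stance-phase approximation
    to full inverse dynamics. Returns the moment component about the anterior
    (x) axis, i.e. the frontal-plane ad/abduction moment; whether a positive
    value means adduction depends on the chosen axis directions.
    """
    rx, ry, rz = (c - k for c, k in zip(cop, knee_center))  # lever arm, knee -> COP
    fx, fy, fz = grf
    # Moment m = r x F; the frontal-plane component is m_x = ry*fz - rz*fy
    return ry * fz - rz * fy

# Hypothetical mid-stance snapshot: COP medial to the knee centre, near-vertical GRF
kam = frontal_plane_knee_moment(cop=(0.10, -0.03, 0.0),
                                knee_center=(0.08, 0.02, 0.45),
                                grf=(0.0, 0.0, 700.0))  # N·m
```

Shifting the COP laterally (the intended wedge effect) shortens the frontal-plane lever arm and shrinks this moment, which is the mechanism the abstracts above test.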
After 3 months of treatment with insoles, changes in pain and physical functioning were assessed via the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) and patient-perceived global change scores. RESULTS Reductions in the adduction moment occurred with insoles (first peak mean [95% confidence interval (95% CI)] −0.22 [−0.28, −0.15] Nm/body weight x height%), accompanied by a reduction in walking pain of approximately 24% (mean [95% CI] −1.0 [−4.0, 2.0]). Insoles had no mean effect on static alignment. Mean improvement in WOMAC pain (P = 0.004) and physical functioning (mean [95% CI] −6 [−11, −1]) was observed at 3 months, with 25 (69%) and 26 (72%) of 36 individuals reporting global improvement in pain and functioning, respectively. Regression analyses demonstrated that disease severity, baseline functioning, and the magnitude of immediate change in walking pain and first peak adduction moment with insoles were predictive of clinical outcome at 3 months. CONCLUSION Lateral wedges immediately reduced the knee adduction moment and walking pain but had no effect on static alignment. Although some parameters predicted clinical outcome, these explained only one-third of the variance, suggesting that other, unknown factors are also important. OBJECTIVE To determine the effects of lateral wedged insoles on knee kinetics and kinematics during walking, according to radiographic severity of medial compartment knee osteoarthritis (OA). DESIGN A prospective case-control study of patients with medial compartment OA of the knee. SETTING Gait analysis laboratory in a university hospital. PARTICIPANTS Forty-six medial compartment knees with OA of 23 patients with bilateral disease and 38 knees of 19 age-matched healthy subjects as controls. INTERVENTIONS Not applicable.
MAIN OUTCOME MEASURES We measured the peak external adduction moment at the knee during the stance phase of gait and the first acceleration peak after heel strike at the lateral side of the femoral condyles. The Kellgren and Lawrence grading system was used for radiographic assessment of OA severity. RESULTS The mean value of the peak external adduction moment of the knee was higher in OA knees than in controls. Application of lateral wedged insoles significantly reduced the peak external adduction moment in Kellgren-Lawrence grades I and II knee OA patients. The first acceleration peak value after heel strike in these patients was relatively high compared with controls. Application of lateral wedged insoles significantly reduced the first acceleration peak in Kellgren-Lawrence grades I and II knee OA patients. CONCLUSIONS The kinetic and kinematic effects of wearing lateral wedged insoles were significant in Kellgren-Lawrence grades I and II knee OA. The results support the recommendation of lateral wedged insoles for patients with early and mild knee OA. OBJECTIVE In uncontrolled studies, a lateral-wedge insole has reduced knee pain in patients with medial knee osteoarthritis (OA). The aim of this study was to test the efficacy of this simple, low-cost intervention for pain in patients with medial knee OA. METHODS We conducted a double-blind, randomized, crossover trial designed to detect a small effect of treatment. Participants were at least 50 years of age and had medial joint space narrowing on posteroanterior semiflexed radiographs and scores indicating moderate pain for 2 of the 5 items on the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) pain scale. Participants were randomized to receive a 5-degree lateral-wedge insole or a neutral insole for 6 weeks. Following a 4-week washout period, participants crossed over to the other treatment for 6 weeks.
Knee pain, the primary outcome, was assessed by the WOMAC pain scale (visual analog scale version). Secondary outcomes included the WOMAC disability subscale, overall knee pain, 50-foot walk time, chair-stand time, and use of medications for knee pain. RESULTS Ninety patients were randomized. The mean difference in pain between the 2 treatments was 13.8 points on the WOMAC pain scale (95% confidence interval −3.9, 31.4 [P=0.13]). We observed similar small effects for the secondary outcomes. CONCLUSION The effect of treatment with a lateral-wedge insole for knee OA was neither statistically significant nor clinically important. The effect of a valgus knee brace and a laterally wedged insole on knee and ankle kinematics and kinetics was evaluated in ten patients with medial knee osteoarthritis (OA). The knee orthosis was tested in two valgus adjustments (4° and 8°), and the laterally wedged insole was fabricated with an inclination of 4°. A motion capture system and force platforms were used for data collection, and joint moments were calculated using inverse dynamics. The valgus moment applied by the orthosis was also measured using a strain gauge implemented at the orthosis' rotational axis. For the second peak knee adduction moment, decreases of 18%, 21%, and 7% were observed between baseline and the test conditions for the orthosis in 4° valgus, in 8° valgus, and the insole, respectively. Similar decreases were observed for the knee lever arm in the frontal plane. Knee adduction angular impulse decreased 14%, 18%, and 7% from baseline for the orthosis in 4° valgus, in 8° valgus, and the insole, respectively. The knee angle in the frontal plane reached a more valgus position during gait with the valgus knee brace. The valgus moment applied by the orthosis with the 8° valgus adjustment was 30% higher than with the 4° valgus adjustment.
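Crossover comparisons like the WOMAC result above rest on the mean within-patient difference and its 95% confidence interval. A minimal sketch with hypothetical difference scores (not the trial's data; the t critical value is supplied from a table, since Python's standard library has no t-distribution quantiles):

```python
from statistics import mean, stdev

def paired_ci(differences, t_crit):
    """Mean within-patient difference with a 95% CI: mean ± t_crit * SE.

    t_crit is the two-sided 95% critical value of Student's t with
    len(differences) - 1 degrees of freedom, looked up externally.
    """
    n = len(differences)
    m = mean(differences)
    se = stdev(differences) / n ** 0.5  # standard error of the mean difference
    return m, (m - t_crit * se, m + t_crit * se)

# Hypothetical WOMAC-style pain differences (wedge minus neutral) for 10 patients
diffs = [20, -5, 30, 10, 0, 25, -10, 15, 40, -25]
m, (lo, hi) = paired_ci(diffs, t_crit=2.262)  # t_0.975 with 9 df
```

Here the interval spans zero, mirroring the abstract's point that a positive-looking mean difference can still be statistically inconclusive.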
The valgus knee orthosis was more effective than the laterally wedged insole in reducing the knee adduction moment in patients with medial knee OA. OBJECTIVE Gait biomechanics (knee adduction moment, center of pressure) and static alignment were investigated to determine the mechanical effect of foot orthoses in people with medial compartment knee osteoarthritis. DESIGN Repeated measures design in which subjects were exposed to three conditions (normal footwear, heel wedge and orthosis) in random order. BACKGROUND The knee adduction moment is an indirect measure of medial compartment loading. It was hypothesized that the use of a 5-degree valgus wedge and a 5-degree valgus modified orthosis would shift the center of pressure laterally during walking, thereby decreasing the adduction moment arm and the adduction moment. METHODS Peak knee adduction moment and center of pressure excursion were obtained in nine subjects with medial compartment knee OA during level walking using an optoelectric system and force plate. Static radiographs were taken in 12 subjects using precision radiographs. RESULTS There was no difference between conditions in static alignment, the peak adduction moment, or excursion of the center of pressure in the medial-lateral direction. No relationship was found between the adduction moment and center of pressure excursion in the medial-lateral plane. The displacement of the center of pressure in the anterior-posterior direction, measured relative to the laboratory coordinate system, was decreased with the orthosis compared with the control condition (P=0.036), and this measure was correlated with the adduction moment (r=0.45, P=0.019). CONCLUSIONS The proposed mechanism was not supported by the findings. The reduction in the center of pressure excursion in the anterior-posterior direction suggests that foot positioning was altered, possibly to a toe-out position, while subjects wore the orthoses.
Based on the current findings, we hypothesize that toe-out positioning may reduce medial joint load. RELEVANCE Knee osteoarthritis is the most common cause of chronic disability amongst seniors. Developing inexpensive, non-invasive treatment strategies for this large population has the potential to impact health care costs, quality of life and clinical outcomes. UNLABELLED This study compared immediate changes in knee and ankle/subtalar biomechanics with lateral wedge orthotics with and without custom arch support in people with knee osteoarthritis and flat feet. Twenty-six participants with radiographic evidence of medial knee osteoarthritis (22 females; age 64.0 years [SD 8.0 years], BMI 27.2 kg/m(2) [4.2]) and flat feet (median foot posture index = +5) underwent three-dimensional gait analysis for three conditions: control (no orthotic), lateral wedge, and lateral wedge plus arch support. Condition order was randomized. Outcomes included frontal plane knee and ankle/subtalar biomechanics, and comfort. Compared to the control, the lateral wedge and the lateral wedge with arch support reduced the knee adduction moment impulse by 8% and 6%, respectively (p < 0.05). However, the lateral wedge resulted in a more everted foot position (4.3 degrees) than the lateral wedge plus arch support (3.2 degrees) (p < 0.05). In contrast, the lateral wedge plus arch support reduced foot frontal plane excursion compared to the other conditions (p < 0.05). Participants self-reported significantly more immediate comfort with the lateral wedge plus arch support compared to the control, whereas there was no difference in self-reported comfort between the lateral wedge and control. No immediate changes in knee pain were observed in any condition.
CLINICAL SIGNIFICANCE Rather than prescribing lateral wedges to all patients with knee osteoarthritis, those who have medial knee osteoarthritis and flat feet may prefer to use the combined orthotic to reduce loads across the knee and to minimize the risk of foot and ankle symptoms as a consequence of orthotic treatment. © 2016 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 34:1597-1605, 2016. Background: There is contradictory evidence regarding whether the addition of medial arch supports to laterally wedged insoles reduces the knee adduction moment, improves comfort, and reduces knee pain during the late stance phase of gait. Objectives: To verify whether such effects occur in participants with medial knee osteoarthritis. Study design: Randomized single-blinded study. Methods: Gait analysis was performed on 18 patients affected by medial knee osteoarthritis. Pain and comfort scores and frontal plane kinematics and kinetics of the ankle, knee, and hip were compared in four conditions: without foot orthoses, with foot orthoses with medial arch support, and with foot orthoses with medial arch support plus lateral wedge insoles of 6° and 10° inclination. Results: Lower-extremity gait kinetics were characterized by a significant decrease, greater than 6%, in the second peak knee adduction moment in the laterally wedged insole conditions compared to the other conditions (p < 0.001; effect size = 0.6). No significant difference in knee adduction moment was observed between the laterally wedged insole conditions. In contrast, a significant increase of 7% in knee adduction moment during the loading response was observed in the customized foot orthoses without lateral inclination condition (p < 0.001; effect size = 0.3). No difference was found in comfort or pain ratings between conditions.
Conclusion: Our study suggests that customized foot orthoses with a medial arch support may only be suitable for the management of medial knee osteoarthritis when a lateral wedge is included. Clinical relevance Our data suggest that customized foot orthoses with medial arch support and a lateral wedge reduce knee loading in patients with medial knee osteoarthritis (KOA). We also found evidence that medial arch support may increase knee loading, which could potentially be detrimental in KOA patients.
11,027
27,903,238
According to the checklist, the most comprehensive and ready-to-implement version of cognitive stimulation was Cognitive Stimulation Therapy. Conclusions Reports of interventions rarely include consideration of implementation in practice. This study was able to show that the ImpRess checklist was feasible in practice and reliable. The checklist may be useful in evaluating readiness for implementation of other manualised interventions.
Background Research reporting the results of clinical trials and psychosocial or technological interventions frequently omits critical details needed to inform implementation in practice. The aim of this article is to develop an Implementation Readiness (ImpRess) checklist that includes criteria deemed useful in measuring readiness for implementation, and to apply it to trials of cognitive stimulation in dementia, providing a systematic review of their readiness for widespread implementation.
Background Currently available pharmacological and non-pharmacological treatments have shown only modest effects in slowing the progression of dementia. Our objective was to assess the impact of a long-term non-pharmacological group intervention on cognitive function in dementia patients and on their ability to carry out activities of daily living, compared to a control group receiving the usual care. Methods A randomized, controlled, single-blind longitudinal trial was conducted with 98 patients (follow-up: n = 61) with primary degenerative dementia in five nursing homes in Bavaria, Germany. The highly standardized intervention consisted of motor stimulation, practice in activities of daily living, and cognitive stimulation (acronym MAKS). It was conducted in groups of ten patients led by two therapists for 2 hours, 6 days a week for 12 months. Control patients received treatment as usual. Cognitive function was assessed using the cognitive subscale of the Alzheimer's Disease Assessment Scale (ADAS-Cog), and the ability to carry out activities of daily living using the Erlangen Test of Activities of Daily Living (E-ADL test), at baseline and after 12 months. Results Of the 553 individuals screened, 119 (21.5%) were eligible and 98 (17.7%) were ultimately included in the study. At 12 months, the results of the per-protocol analysis (n = 61) showed that cognitive function and the ability to carry out activities of daily living had remained stable in the intervention group but had decreased in the control patients (ADAS-Cog: adjusted mean difference: -7.7, 95% CI -14.0 to -1.4, P = 0.018, Cohen's d = 0.45; E-ADL test: adjusted mean difference: 3.6, 95% CI 0.7 to 6.4, P = 0.015, Cohen's d = 0.50). The effect sizes for the intervention were greater in the subgroup of patients (n = 50) with mild to moderate disease (ADAS-Cog: Cohen's d = 0.67; E-ADL test: Cohen's d = 0.69).
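Cohen's d values like those reported above are standardized mean differences: the between-group difference divided by a pooled standard deviation. A minimal sketch with hypothetical summary statistics (the trial itself reported covariate-adjusted differences, which this simple formula does not reproduce):

```python
import math

def cohens_d(mean1, mean2, sd1, sd2, n1, n2):
    """Cohen's d for two independent groups: mean difference over the pooled SD."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2)
                          / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Hypothetical ADAS-Cog change scores: intervention roughly stable (+0.5),
# controls declining (-7.2); SDs and group sizes are invented for illustration
d = cohens_d(mean1=0.5, mean2=-7.2, sd1=8.0, sd2=8.0, n1=30, n2=31)
```

By the usual rule of thumb, d around 0.5 (as in the trial) is a medium effect and d around 0.8 a large one.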
Conclusions A highly standardized, non-pharmacological, multicomponent group intervention conducted in a nursing-home setting was able to postpone a decline in cognitive function in dementia patients, and in their ability to carry out activities of daily living, for at least 12 months. Trial Registration http://www.isrctn.com Identifier: ISRCT BACKGROUND Reality orientation therapy combined with cholinesterase inhibitors has not been evaluated in patients with Alzheimer's disease. AIMS To perform such an evaluation. METHOD We randomly assigned 79 of 156 patients treated with donepezil to receive a reality orientation programme. Caregivers of the treatment group were trained to offer the programme at home 3 days a week, 30 min/day, for 25 consecutive weeks, and were invited to stimulate and involve patients in reality-based communication. RESULTS The treatment group showed a slight improvement in Mini-Mental State Examination (MMSE) scores (mean change +0.2, s.e. = 0.4) compared with a decline in the control group (mean change -1.1, s.e. = 0.4; P=0.02). Similar results were seen for the Alzheimer's Disease Assessment Scale-Cognition (treatment group mean change +0.4, s.e. = 0.8; control group -2.5, s.e. = 0.8; P=0.01). The intervention had an equal effect on cognition in those with mild (MMSE score > or = 20) and moderate (score < 20) dementia. No significant effect was observed for behavioural and functional outcomes. CONCLUSIONS Reality orientation enhances the effects of donepezil on cognition in Alzheimer's disease. We compared reality orientation with reminiscence therapy for elderly people in a large residential home, using a controlled cross-over design. Both kinds of therapy group were enjoyed by both staff and residents, and enabled staff to get to know moderately and severely confused residents.
The group that received reality orientation followed by reminiscence therapy showed improvement in cognitive and behavioural measures which was not found in the other two groups. It may be important to use reality orientation techniques with confused residents before involving them in a reminiscence group. Background The Promoting Action on Research Implementation in Health Services framework, or PARIHS, is a conceptual framework that posits key, interacting elements that influence successful implementation of evidence-based practices. It has been widely cited and used as the basis for empirical work; however, there has not yet been a literature review to examine how the framework has been used in implementation projects and research. The purpose of the present article was to critically review and synthesize the literature on PARIHS to understand how it has been used and operationalized, and to highlight its strengths and limitations. Methods We conducted a qualitative, critical synthesis of peer-reviewed PARIHS literature published through March 2009. We synthesized findings through a three-step process using semi-structured data abstraction tools and group consensus. Results Twenty-four articles met our inclusion criteria: six core concept articles from the original PARIHS authors, and eighteen empirical articles ranging from case reports to quantitative studies. Empirical articles generally used PARIHS as an organizing framework for analyses. No studies used PARIHS prospectively to design implementation strategies, and there was generally a lack of detail about how variables were measured or mapped, or how conclusions were derived. Several studies used findings to comment on the framework in ways that could help refine or validate it. The primary issue identified with the framework was a need for greater conceptual clarity regarding the definition of sub-elements and the nature of dynamic relationships.
Strengths identified included its flexibility, intuitive appeal, explicit acknowledgement of the outcome of 'successful implementation,' and a more expansive view of what can and should constitute 'evidence.' Conclusions While we found studies reporting empirical support for PARIHS, the single greatest need for this and other implementation models is rigorous, prospective use of the framework to guide implementation projects. There is also a need to better explain derived findings and how interventions or measures are mapped to specific PARIHS elements; greater conceptual discrimination among sub-elements may be necessary first. In general, it may be time for the implementation science community to develop consensus guidelines for reporting the use and usefulness of theoretical frameworks within implementation studies. Several data suggest that physical activity and cognitive stimulation have a positive effect on the quality of life (QoL) of people with Alzheimer's disease (AD), slowing the decline due to the disease. A pilot project was undertaken to assess the effect of cognitive stimulation, physical activity, and socialization on the QoL and mood of patients with AD and their informal caregivers. Fourteen patients with AD were randomly divided into an active treatment group and a control group. At the end of treatment, a significant improvement in apathy, anxiety, depression, and QoL was found in the active treatment group. Considering caregivers, those of the active treatment group exhibited a significant improvement in their mood and in their perception of patients' QoL. This study provides evidence that a combined approach based on cognitive stimulation, physical activity, and socialization is a feasible tool to improve mood and QoL in patients with AD and their caregivers. BACKGROUND Psychological therapy groups for people with dementia are widely used, but their cost-effectiveness has not been explored.
AIMS To investigate the cost-effectiveness of an evidence-based cognitive stimulation therapy (CST) programme for people with dementia as part of a randomised controlled trial. METHOD A total of 91 people with dementia, living in care homes or the community, received a CST group intervention twice weekly for 8 weeks; 70 participants with dementia received treatment as usual. Service use was recorded 8 weeks before and during the 8-week intervention, and costs were calculated. A cost-effectiveness analysis was conducted with cognition as the primary outcome and quality of life as the secondary outcome. Cost-effectiveness acceptability curves were plotted. RESULTS Cognitive stimulation therapy has benefits for cognition and quality of life in dementia, and costs did not differ between the groups. Under reasonable assumptions, there is a high probability that CST is more cost-effective than treatment as usual with regard to both outcome measures. CONCLUSIONS Cognitive stimulation therapy for people with dementia has effectiveness advantages over, and may be more cost-effective than, treatment as usual. Current methodology has made popular the pragmatic randomized trial (Foster & Little, 2012) and has led to an acknowledged growth of high-quality research in the 'new generation' of psychosocial interventions in dementia care (Orrell, 2012). However, many trials of psychosocial interventions in dementia lack impact. It is not clear whether this reflects a genuine ineffectiveness, since the methodology used in dementia care research is at present weak in addressing the important distinction between genuine ineffectiveness and an implementation error. Therefore, potentially effective dementia care interventions that may have a positive impact on patient experience can fail to show effectiveness, thereby reducing treatment options and wasting research money.
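A cost-effectiveness acceptability curve of the kind plotted in the CST trial above gives, for each willingness-to-pay threshold, the probability that the intervention is cost-effective. A minimal bootstrap sketch on hypothetical per-patient incremental data; the function and every number are illustrative, not the trial's analysis:

```python
import random

def ceac(delta_effects, delta_costs, wtp_values, n_boot=2000, seed=0):
    """Cost-effectiveness acceptability curve by nonparametric bootstrap.

    delta_effects / delta_costs: per-patient incremental effect and cost
    (intervention minus control; pairing is a simplifying assumption here).
    For each willingness-to-pay (WTP) value, returns the proportion of
    bootstrap resamples in which the mean net monetary benefit
    (WTP * effect - cost) is positive.
    """
    rng = random.Random(seed)
    n = len(delta_effects)
    # Reuse the same resamples across WTP values so the curve is smooth
    idx_samples = [[rng.randrange(n) for _ in range(n)] for _ in range(n_boot)]
    curve = []
    for wtp in wtp_values:
        nmb = [wtp * e - c for e, c in zip(delta_effects, delta_costs)]
        wins = sum(1 for idx in idx_samples
                   if sum(nmb[i] for i in idx) / n > 0)
        curve.append(wins / n_boot)
    return curve

# Hypothetical data: small cognitive gain at near-zero average extra cost
effects = [0.8, 1.2, -0.3, 0.5, 1.0, 0.2, 0.9, -0.1, 0.7, 0.4]
costs = [10.0, -5.0, 20.0, 0.0, -10.0, 5.0, 15.0, -20.0, 8.0, 2.0]
probabilities = ceac(effects, costs, wtp_values=[0, 50, 100])
```

With a positive mean effect and roughly cost-neutral intervention, the probability of cost-effectiveness rises towards 1 as the willingness-to-pay threshold grows, which is the pattern the trial describes as "a high probability that CST is more cost-effective."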
Methodological discussions about the success and failure of interventions in dementia care within the European INTERDEM group of multi-professional researchers on psychosocial dementia care (www.interdem.org/) led to a critical analysis of the suitability of the current methodology. This has often relied on the Medical Research Council's (MRC) framework for complex interventions and its iterations (Campbell et al., 2000; Craig et al., 2008; Craig & Petticrew, 2013). Foster and Little (2012) pinpoint an important obstacle to improving clinical practice: the legacy of 'simple' drug therapy evaluations, which is engrained in the design and conduct of the pragmatic trial, a view that can also be extended more generally to the interpretation and conduct of the MRC framework for complex interventions and its methodologies. The advantage of pragmatic trials is their proximity to daily practice in routine health care settings. This has particular implications for the design and conduct of psychosocial intervention research in dementia, where the important distinction between poor attention to implementation and genuine ineffectiveness is often overlooked. Furthermore, better attention to the implications of this proximity in the design and conduct of psychosocial intervention research trials may reduce the need for a separate large-scale implementation study phase as currently conceived within the MRC approach to complex interventions (Craig et al., 2008; Craig & Petticrew, 2013). This is an urgent issue for psychosocial intervention research in dementia care, as today 35.6 million people and families worldwide live with dementia (Prince et al., 2013), whilst major pharmaceutical advances in prevention and cure remain elusive (Miller, 2012).
The predicted future growth of dementia cases over the next decade will undoubtedly raise the already high costs of care, estimated at US$604 billion in 2010 (Wimo et al., 2013), in an illness that is most feared by people over the age of 55 years (Le Couteur, Doust, Creasey, & Brayne, 2013), since it can undermine core human capacities, resulting in decline in quality of life (Selkoe, 2012). Although a cure for dementia is not available, we are not empty-handed. Psychosocial intervention research has emerged as today's forerunner, given its aim to improve daily practice among professionals who provide support to people and families living with dementia. Psychosocial interventions in dementia involve interactions between people to improve psychological and social functioning (Moniz-Cook, Vernooij-Dassen, Woods, & Orrell, 2011; Rabins et al., 2007), such as cognitive stimulation therapy (Woods, Aguirre, Spector, & Orrell, 2012), occupational therapy (Graff et al., 2006) and support programmes for family carers (Vernooij-Dassen, Draskovic, McCleery, & Downs, 2011). This paper outlines the rationale for a paradigm shift in the design and methodology for evaluation of complex interventions in applied dementia care research. We use psychosocial intervention research as an exemplar of how researchers and their funders may achieve better value for money. This randomized study evaluated the combined effect of a cognitive-communication program plus an acetylcholinesterase inhibitor (donepezil; donepezil-plus-stimulation group; n = 26), as compared with donepezil alone (donepezil-only group; n = 28), in 54 patients with mild to moderate Alzheimer's disease (AD; Mini-Mental Status Examination score of 12-28) ranging in age from 54 to 91 years.
It was hypothesized that cognitive-communication stimulation in combination with donepezil would positively affect the following: (a) relevance of discourse, (b) performance of functional abilities, (c) emotional symptoms, (d) quality of life, and (e) overall global function, as measured by caregiver and participant report and standardized measures. Cognitive-communication, neuropsychiatric, functional performance, and quality of life evaluations were conducted at baseline and Month 4, the month after the 2-month active stimulation period. Follow-up evaluations were performed at Months 8 and 12. The stimulation program consisted of 12 hr of intervention over an 8-week period and involved participant-led discussions requiring homework, interactive sessions about AD, and discussions using salient life stories. Additive effects of active stimulation with donepezil were examined in 2 ways: (1) comparing mean group performance over time and (2) evaluating change scores from baseline. A Group x Time interaction was found for the donepezil-plus-stimulation group in the emotional symptoms of apathy and irritability as compared with the donepezil-only group. Evaluation of change scores from baseline to 12 months revealed a positive effect for the donepezil-plus-stimulation group on discourse and functional abilities, with a trend on apathy, irritability, and patient-reported quality of life. In sum, the research revealed benefits to the donepezil-plus-stimulation group in the areas of discourse abilities, functional abilities, emotional symptoms, and overall global performance. This study adds to growing evidence that active cognitive stimulation may slow the rate of verbal and functional decline and decrease negative emotional symptoms in AD when combined with acetylcholinesterase inhibitors, indicating a need to advance research in the area of cognitive treatments.
The fact that AD is a progressive brain disease should not preclude ameliorative treatment. Objective: To study the efficacy of cognitive rehabilitation combined with acetylcholinesterase inhibitor (AChE-I) treatment in patients with mild Alzheimer's disease and their relatives. Method: Thirteen patients with mild Alzheimer's disease treated with rivastigmine 6-12 mg/day for more than two months started cognitive rehabilitation training. Before and after the cognitive rehabilitation training, patients were assessed through cognitive tests, an activities of daily living scale, a neuropsychological battery and scales to evaluate caregivers' depressive and anxiety symptoms. Six patients were randomized to a combined treatment group (AChE-I plus cognitive rehabilitation and caregiver support) and seven patients to a control group (AChE-I only) and followed up for five months. Results: Mini-Mental State Examination (MMSE) scores (p=0.047) and backward digit span scores (p=0.018) were significantly different between the groups at follow-up. The combined treatment group showed a better positive treatment effect on cognitive and neuropsychological tests applied to patients, and a reduction of psychiatric symptoms was observed in their caregivers (nonsignificant). Conclusion: Cognitive rehabilitation associated with AChE-I treatment can potentially be useful to stabilize or improve cognitive and functional performance of patients with mild Alzheimer's disease and can reduce caregivers' psychiatric symptoms. Recent studies have shown that patients with Alzheimer's disease (AD) and its possible prodromal stage, mild cognitive impairment, benefit from cognitive interventions. Few studies so far have used an active control condition and determined effects in different stages of disease.
We evaluated a newly developed 6-month group-based multicomponent cognitive intervention in a randomized controlled pilot study of subjects with amnestic mild cognitive impairment (aMCI) and mild AD patients. Forty-three subjects with aMCI and mild AD were recruited. Primary outcome measures were change in global cognitive function as determined by the Alzheimer's Disease Assessment Scale-cognitive subscale (ADAS-cog) and the Mini-Mental Status Examination (MMSE). Secondary outcomes were specific cognitive and psychopathological ratings. Thirty-nine patients were randomized to intervention groups (IGs: 12 aMCI, 8 AD) and active control groups (CGs: 12 aMCI, 7 AD). At the end of the study, we found significant improvements in the IG(MCI) compared to the CG(MCI) in the ADAS-cog (p = 0.02) and for the secondary endpoint Montgomery Asberg Depression Rating Scale (MADRS) (p < 0.01). Effects on the MMSE score showed a non-significant trend (p = 0.07). In AD patients, we found no significant effect of intervention on the primary outcome measures. In conclusion, these results suggest that participation in a 6-month cognitive intervention can improve cognitive and non-cognitive functions in aMCI subjects. In contrast, AD patients showed no significant benefit from the intervention. The findings in this small sample support the use of the intervention in larger-scale studies with an extended follow-up period to determine long-term effects. A study was performed on patients with Alzheimer's disease (AD) in order to evaluate the efficacy of a combined treatment (donepezil plus cognitive training) on both cognitive processes and affective states. Eighty-six subjects, 25 men and 61 women, with an average age of 75.58 years, were studied. Almost all the subjects had a basic educational level.
Donepezil was administered at a dose of 10 mg daily along with cognitive treatment involving images of everyday life and reminiscent music ; the sessions took place on Monday to Friday and lasted three quarters of an hour . The study lasted 12 months . Subjects underwent test-retest with the following tests : Mini-Mental State Examination ( MMSE ) , the cognitive subscale of the Alzheimer 's Disease Assessment Scale ( ADAS-cog ) , the Geriatric Depression Scale ( GDS ) and the overall deterioration scale ( FAST ) . The results showed that subjects receiving the combined treatment had a better response than those who did not receive any cognitive training . These subjects ' MMSE score decreased by 3.24 on average . The affective symptomatology of those receiving only drug treatment improved whereas the cognitive processes did not . Sixty long-stay patients who were demented or withdrawn or both were randomly allocated to reality orientation or diversional occupational therapy and were assessed blind on two scales . Owing to a high drop-out rate only 19 organic and 19 functional patients satisfied reasonable criteria for inclusion in the analyses . Those treated by reality orientation therapy fared better than the controls but not significantly so . Initial test scores were no guide to outcome . The results suggested that both organic and functional patients benefited cognitively , but hardly at all behaviourally , from both types of treatment . Classroom Reality Orientation was investigated with a sample of dements drawn from a psychogeriatric hospital and an old people 's home . Treatment effects were examined on a range of cognitive and behavioural measures , and compared with the effects of a specific behaviourally directed orientation training in the ward . Class RO improved cognitive functioning but not behaviour , irrespective of degree of dementia . 
Highly significant behavioural change was demonstrated for ward behaviour orientation training , and this technique is not only cheaper but has potential in managing dements . OBJECTIVES Cognitive stimulation therapy ( CST ) has been shown to produce improvements in cognition and quality of life which compare favourably with trials of cholinesterase inhibitors . The aim of the present study was to evaluate the efficacy of CST , replicating the methods of Spector et al in the British Journal of Psychiatry in 2003 in a smaller sample using a control group engaged in routine activities . METHODS Eligible participants ( mild to moderate dementia ; MMSE range 10 - 23 ) were randomised to CST group or control conditions . Pre- and post-intervention testing was undertaken by assessors who were blind to condition . Measures included MMSE , CDR ( sum of boxes ) , ADAS-cog , RAID ( anxiety ) , abbreviated GDS ( depression ) , QoL-AD , and the CAPE Behaviour Rating Scale ( BRS ) . Analysis was by non-parametric statistics . Occupational therapists facilitated two sessions per week for seven weeks in two long-term care facilities and the same programme was run by the activity co-ordinator in a nursing home unit . RESULTS Fourteen CST and 13 control participants completed the study . Analysis of between-group difference scores showed that the CST group improved compared to controls on MMSE ( Mann-Whitney U = 32 , p = 0.013 ) and on the QoL-AD , which just fell short of significance ( U=51.5 , p = 0.055 ) . Qualitatively , therapists noted that CST participants demonstrated good interaction and enthusiasm in the group environment , with continuity and carryover between sessions . CONCLUSIONS Even though the sample sizes are small , the current study is consistent with Spector et al 's 2003 findings of beneficial effects in people with dementia following CST . The programme is recommended as an intervention for people with mild to moderate dementia .
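The CST trial above reports nonparametric between-group comparisons such as Mann-Whitney U = 32 ( p = 0.013 ) for the MMSE difference scores . As a minimal sketch of how the U statistic itself is computed ( the change scores below are invented for illustration , not the trial 's data ) :

```python
def mann_whitney_u(x, y):
    """Mann-Whitney U for sample x versus sample y.

    U counts, over all pairs (xi, yj), how often xi exceeds yj;
    ties contribute 0.5. The reported test statistic is min(Ux, Uy).
    """
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u

# Hypothetical MMSE change scores (illustration only):
cst_group = [3, 2, 4, 1, 3]
control   = [0, -1, 1, 0, -2]

u_cst = mann_whitney_u(cst_group, control)
u_ctl = mann_whitney_u(control, cst_group)
u_stat = min(u_cst, u_ctl)

# Sanity check: Ux + Uy always equals len(x) * len(y)
assert u_cst + u_ctl == len(cst_group) * len(control)
```

A small observed U ( relative to n1*n2 ) indicates little overlap between the two groups ; the p-value is then read from the exact U distribution or a normal approximation .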
11,028
10,913,051
CONCLUSION From the point of view of evidence-based medicine it should be seriously questioned whether yttrium synovectomy deserves a place in clinical practice .
OBJECTIVE To consider the question : How strong is the evidence in favour of yttrium synovectomy in chronic knee arthritis in patients with rheumatoid arthritis in comparison with placebo and intra-articular steroid treatment ?
A restricted sequential design multicentre controlled trial of yttrium-90 against triamcinolone intra-articularly was undertaken in patients with rheumatoid arthritis with knee involvement . The trial had to be discontinued because of dwindling recruitment over time . The reasons for this and other features contributing to an inconclusive outcome are noted . This experience lends little encouragement to the idea that yttrium-90 therapy is more or less advantageous than triamcinolone hexacetonide . The use of irradiation of the synovium in the treatment of rheumatoid arthritis is presented . A colloidal resin of radioactive yttrium ( 90Y ) or saline was injected intra-articularly into 44 knee joints under double blind conditions . Patients selected for treatment had chronic effusions which were resistant to all forms of therapy , other than surgical synovectomy . All patients were followed up for one year . There was a sustained improvement in joint range in 13 cases ( 57 per cent ) and of knee circumference in eight cases ( 35 per cent ) ( P < 0.005 and < 0.05 respectively ) compared with the control . In seven patients ( 30 per cent ) the joint effusion had completely resolved . Intra-articular yttrium had no advantage over saline for the prevention of radiological deterioration of the joint . Irradiation of the synovium produced a significant reduction in the acid phosphatase concentration of the synovial fluid . One-third of the patients developed general and local reactions to the 90Y injection . Irradiation of the synovium for the treatment of chronic knee effusions associated with rheumatoid arthritis and allied conditions is being increasingly used as an alternative to surgical synovectomy . Intra-articular injections of a colloidal solution of radioactive gold ( 198Au ) have been tried with some success ( Ansell , Crook , Mallard , and Bywaters , 1963 ; Virkkunen , Krusius , and Heiskanen , 1967 ; Makin and Robin , 1968 ; Grahame , Ramsey , and Scott , 1970 ) . 
However , as 198Au has a beta-particle maximum range in tissue of only 4 mm , and as the synovium in these chronic knee effusions can attain a thickness of greater than 1 cm , a complete synovial ablation is not produced . A serious disadvantage of intra-articular 198Au is that there may be marked leakage from the joint to the regional lymph nodes , the liver , and spleen in as many as 36 per cent of patients so treated ( Virkkunen and others , 1967 ) . Moreover , radioactive gold possesses a significant gamma-ray component and this gives an unwanted whole body radiation to the patient . Because of these disadvantages it has been suggested that radioactive yttrium ( 90Y ) may be a more suitable agent ( Ansell and others , 1963 ; Grahame and others , 1970 ) . 90Y is a pure beta-emitter with a maximum range in tissue of 10 mm . It has a higher maximum energy than gold , 2.26 MeV compared with 0.9 MeV , and it has a half-life of 2.7 days . The whole body distribution of 90Y silicate and 90Y resin after intra-articular injections has been compared ( Prichard , Bridgman , and Bleehen , 1970 ) . It was found that silicate preparations were associated with an appreciable leakage from the joint , but that the resin remained localized . It has also been shown that the 90Y resin was taken up more or less evenly in both normal and inflamed rabbit synovia ( Webb , Lowe , and Bluestone , 1969 ) . We have therefore undertaken a controlled double-blind trial , using intra-articular 90Y resin in the treatment of chronic knee effusions . The preliminary results after assessment of all cases 6 months after the intra-articular injections are presented . In 66 patients with rheumatoid knee joint synovitis and hydrops and with only slight radiological destruction , local treatment of the knee was randomly performed either with osmic acid , radioactive yttrium or surgical synovectomy . 
After a one-year follow-up the clinical and radiological results were slightly better in patients who underwent surgical synovectomy than in other treatment groups . However , radiological osteoarthrosis had progressed more in synovectomized patients than in others . In 2 patients with radioactive yttrium a relapse requiring surgical synovectomy was seen . Because osmic acid is almost as effective as surgical synovectomy , is very cheap and easy to perform , it can be recommended as the first choice for local therapy in patients with corticosteroid-resistant knee joint synovitis , in the early stage of the disease . Most systematic reviews rely substantially on the assessment of the methodological quality of the individual trials . The aim of this study was to obtain consensus among experts about a set of generic core items for quality assessment of randomized clinical trials ( RCTs ) . The invited participants were experts in the field of quality assessment of RCTs . The initial item pool contained all items from existing criteria lists . Subsequently , we reduced the number of items by using the Delphi consensus technique . Each Delphi round comprised a questionnaire , an analysis , and a feedback report . The feedback report included staff team decisions made on the basis of the analysis and their justification . A total of 33 international experts agreed to participate , of whom 21 completed all questionnaires . The initial item pool of 206 items was reduced to 9 items in three Delphi rounds . The final criteria list ( the Delphi list ) was satisfactory to all participants . It is a starting point on the way to a minimum reference standard for RCTs on many different research topics . 
This list is not intended to replace , but rather to be used alongside , existing criteria lists . In an open , controlled trial , radioactive 90Yttrium was injected in doses of 3 - 6 mCi into 40 joints with chronic effusions in patients suffering from rheumatoid arthritis and other rheumatoid diseases . The processing in a computer was carried out by the punch card method . Intra-articular administration of 90Y resulted in a significant improvement of five objective criteria . In our study we have paid particular attention to the clinical and statistical differentiation of the effects of systemic , and especially of basic therapy , from the effects of topical intra-articular therapy . At the time of the last check-up examination , 43 % of the patients were free of any effusion . The treatment was well tolerated . Skin necrosis occurred in only one single case . Intra-articular treatment with radioactive 90Yttrium represents a valuable contribution to the therapeutic arsenal for the treatment of chronic articular effusions when all other methods of treatment have failed . The effectiveness of osmic acid and yttrium-90 in the treatment of synovitis of the knee in rheumatoid arthritis is compared in 126 patients followed up for 3 years . Ninety-one knees were injected with osmic acid and eighty-four knees with yttrium-90 . Osmic acid appeared to be more effective than yttrium-90 throughout the period of the follow-up but the difference only reached statistical significance ( p less than 0.05 ) at 3 years . Both therapies were well tolerated by patients and should be considered as an alternative to operative synovectomy .
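The physical rationale above rests on 90Y being a pure beta-emitter with a half-life of about 2.7 days , so the injected activity decays away within a few weeks . A minimal sketch of that exponential decay ( standard half-life arithmetic , not data from these trials ) :

```python
HALF_LIFE_DAYS = 2.7  # 90Y half-life as quoted in the trial background

def remaining_fraction(days):
    """Fraction of the injected 90Y activity remaining after `days` days,
    from the standard decay law A(t) = A0 * 0.5 ** (t / T_half)."""
    return 0.5 ** (days / HALF_LIFE_DAYS)

# After one half-life exactly half the activity remains:
assert abs(remaining_fraction(2.7) - 0.5) < 1e-12
# After ~4 weeks less than 0.1 % of the original activity is left:
assert remaining_fraction(28) < 0.001
```

This short half-life is one reason the localized resin preparation matters : any activity that leaks out of the joint irradiates other tissues during those first days .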
11,029
28,841,486
Our meta- analysis showed significantly beneficial effects on depressive and anxiety symptoms of BD patients in within-group analysis . However , this significance was not observed in comparison with the control groups .
BACKGROUND Mindfulness-based interventions ( MBIs ) have been increasingly used as an adjunctive treatment to pharmacotherapy for a few psychiatric disorders . However , few studies have investigated the efficacy of MBIs in bipolar disorder ( BD ) . METHODS We performed a systematic review and meta- analysis to evaluate the efficacy of MBIs as an adjunctive treatment in BD .
BACKGROUND Preliminary research findings have shown that mindfulness-based cognitive therapy ( MBCT ) improves anxiety and depressive symptoms in bipolar disorder . In this study , we further investigated the effects of MBCT in bipolar disorder , in a controlled fMRI study . METHOD Twenty-three patients with bipolar disorder underwent neuropsychological testing and functional MRI . Sixteen of these patients were tested before and after an eight-week MBCT intervention , and seven were waitlisted for training and tested at the same intervals . The results were compared with 10 healthy controls . RESULTS Prior to MBCT , bipolar patients reported significantly higher levels of anxiety and symptoms of stress , scored significantly lower on a test of working memory , and showed significant BOLD signal decrease in the medial PFC during a mindfulness task , compared to healthy controls . Following MBCT , there were significant improvements in the bipolar treatment group , in measures of mindfulness , anxiety and emotion regulation , and in tests of working memory , spatial memory and verbal fluency compared to the bipolar wait-list group . BOLD signal increases were noted in the medial PFC and posterior parietal lobe , in a repeat mindfulness task . A region of interest analysis revealed strong correlation between signal changes in medial PFC and increases in mindfulness . LIMITATIONS The small control group is a limitation in the study . CONCLUSION These data suggest that MBCT improves mindfulness and emotion regulation and reduces anxiety in bipolar disorder , corresponding to increased activations in the medial PFC , a region associated with cognitive flexibility and previously proposed as a key area of pathophysiology in the disorder . Background Bipolar disorder is highly recurrent and rates of comorbidity are high . Studies have pointed to anxiety comorbidity as one factor associated with risk of suicide attempts and poor overall outcome . 
This study aimed to explore the feasibility and potential benefits of a new psychological treatment ( Mindfulness-based Cognitive Therapy : MBCT ) for people with bipolar disorder focusing on between-episode anxiety and depressive symptoms . Methods The study used data from a pilot randomized trial of MBCT for people with bipolar disorder in remission , focusing on between-episode anxiety and depressive symptoms . Immediate effects of MBCT versus waitlist on levels of anxiety and depression were compared between unipolar and bipolar participants . Results The results suggest that MBCT led to improved immediate outcomes in terms of anxiety which were specific to the bipolar group . Both bipolar and unipolar participants allocated to MBCT showed reductions in residual depressive symptoms relative to those allocated to the waitlist condition . Limitations Analyses were based on a small sample , limiting power . Additionally the study recruited participants with suicidal ideation or behaviour so the findings cannot immediately be generalized to individuals without these symptoms . Conclusions The study , although preliminary , suggests an immediate effect of MBCT on anxiety and depressive symptoms among bipolar participants with suicidal ideation or behaviour , and indicates that further research into the use of MBCT with bipolar patients may be warranted . Weekly affective symptom severity and polarity were compared in 135 bipolar I ( BP I ) and 71 bipolar II ( BP II ) patients during up to 20 yr of prospective symptomatic follow-up . The course of BP I and BP II was chronic ; patients were symptomatic approximately half of all follow-up weeks ( BP I 46.6 % and BP II 55.8 % of weeks ) . Most bipolar disorder research has concentrated on episodes of MDD and mania and yet minor and subsyndromal symptoms are three times more common during the long-term course . 
Weeks with depressive symptoms predominated over manic/hypomanic symptoms in both disorders ( 3:1 in BP I ) , and BP II at 37:1 had a largely depressive course ( depressive symptoms=59.1 % of weeks vs. hypomanic=1.9 % of weeks ) . BP I patients had more weeks of cycling/mixed polarity , hypomanic and subsyndromal hypomanic symptoms . Weekly symptom severity and polarity fluctuated frequently within the same bipolar patient , in which the longitudinal symptomatic expression of BP I and BP II is dimensional in nature involving all levels of affective symptom severity of mania and depression . Although BP I is more severe , BP II with its intensely chronic depressive features is not simply the lesser of the bipolar disorders ; it is also a serious illness , more so than previously thought ( for instance , as described in DSM-IV and ICD-10 ) . It is likely that this conventional view is the reason why BP II patients were prescribed pharmacological treatments significantly less often when acutely symptomatic and during intervals between episodes . Taken together with previous research by us on the long-term structure of unipolar depression , we submit that the thrust of our work during the past decade supports classic notions of a broader affective disorder spectrum , bringing bipolarity and recurrent unipolarity closer together . However the genetic variation underlying such a putative spectrum remains to be clarified . OBJECTIVES People in the late stage of bipolar disorder ( BD ) experience elevated relapse rates and poorer quality of life ( QoL ) compared with those in the early stages . Existing psychological interventions also appear less effective in this group . To address this need , we developed a new online mindfulness-based intervention targeting quality of life ( QoL ) in late stage BD . Here , we report on an open pilot trial of ORBIT ( online , recovery-focused , bipolar individual therapy ) . 
METHODS Inclusion criteria were : self-reported primary diagnosis of BD , six or more episodes of BD , under the care of a medical practitioner , access to the internet , proficient in English , 18 - 65 years of age . Primary outcome was change ( baseline - post-treatment ) on the Brief QoL.BD ( Michalak and Murray , 2010 ) . Secondary outcomes were depression , anxiety , and stress measured on the DASS scales ( Lovibond and Lovibond , 1993 ) . RESULTS Twenty-six people consented to participate ( Age M=46.6 years , SD=12.9 , and 75 % female ) . Ten participants were lost to follow-up ( 38.5 % attrition ) . Statistically significant improvement in QoL was found for the completers , t(15)=2.88 , 95 % CI : .89 - 5.98 , p=.011 ( Cohen's dz=.72 , partial η(2)=.36 ) , and the intent-to-treat sample , t(25)=2.65 , 95 % CI : .47 - 3.76 ( Cohen's dz=.52 ; partial η(2)=.22 ) . A non-significant trend towards improvement was found on the DASS anxiety scale ( p=.06 ) in both completer and intent-to-treat samples , but change on depression and stress did not approach significance . LIMITATIONS This was an open trial with no comparison group , so measured improvements may not be due to specific elements of the intervention . Structured diagnostic assessments were not conducted , and interpretation of effectiveness was limited by substantial attrition . CONCLUSION Online delivery of mindfulness-based psychological therapy for late stage BD appears feasible and effective , and ORBIT warrants full development . Modifications suggested by the pilot study include increasing the 3-week duration of the intervention , adding cautions about the impact of extended meditations , and addition of coaching support/monitoring to optimise engagement . In many clinical settings , there is a high comorbidity between substance use disorders , psychiatric disorders , and traumatic stress . Novel therapies are needed to address these co-occurring issues efficiently . 
The aim of the present study was to conduct a pragmatic randomized controlled trial comparing Mindfulness-Oriented Recovery Enhancement ( MORE ) to group Cognitive-Behavioral Therapy ( CBT ) and treatment-as-usual ( TAU ) for previously homeless men residing in a therapeutic community . Men with co-occurring substance use and psychiatric disorders , as well as extensive trauma histories , were randomly assigned to 10 weeks of group treatment with MORE ( n = 64 ) , CBT ( n = 64 ) , or TAU ( n = 52 ) . Study findings indicated that from pre- to post-treatment MORE was associated with modest yet significantly greater improvements in substance craving , post-traumatic stress , and negative affect than CBT , and greater improvements in post-traumatic stress and positive affect than TAU . A significant indirect effect of MORE on decreasing craving and post-traumatic stress by increasing dispositional mindfulness was observed , suggesting that MORE may target these issues via enhancing mindful awareness in everyday life . This pragmatic trial represents the first head-to-head comparison of MORE against an empirically-supported treatment for co-occurring disorders . Results suggest that MORE , as an integrative therapy designed to bolster self-regulatory capacity , may hold promise as a treatment for intersecting clinical conditions . This study evaluated mindfulness-based cognitive therapy ( MBCT ) , a group intervention designed to train recovered recurrently depressed patients to disengage from dysphoria-activated depressogenic thinking that may mediate relapse/recurrence . Recovered recurrently depressed patients ( n = 145 ) were randomized to continue with treatment as usual or , in addition , to receive MBCT . Relapse/recurrence to major depression was assessed over a 60-week study period . For patients with 3 or more previous episodes of depression ( 77 % of the sample ) , MBCT significantly reduced risk of relapse/recurrence . 
For patients with only 2 previous episodes , MBCT did not reduce relapse/recurrence . MBCT offers a promising cost-efficient psychological approach to preventing relapse/recurrence in recovered recurrently depressed patients . BACKGROUND Bipolar disorder ( BD ) is a chronic and disabling psychiatric disorder characterized by recurrent episodes of mania/hypomania and depression . Dialectical behavior therapy ( DBT ) techniques have been shown to effectively treat borderline personality disorder , a condition also marked by prominent affective disturbances . The utility of DBT techniques in treating BD has been largely unexplored . The purpose of this research was to conduct a pilot study of a DBT-based psychoeducational group ( BDG ) in treating euthymic , depressed , or hypomanic Bipolar I or II patients . METHODS In this experiment , 26 adults with bipolar I or II were randomized to intervention or wait-list control groups and completed the Beck depression inventory II , mindfulness-based self-efficacy scale , and affective control scale at baseline and 12 weeks . The BDG intervention consisted of 12 weekly 90-min sessions which taught DBT skills , mindfulness techniques , and general BD psychoeducation . RESULTS Using RM-ANOVA , subjects in BDG demonstrated a trend toward reduced depressive symptoms , and significant improvement in several MSES subscales indicating greater mindful awareness , and less fear toward and more control of emotional states ( ACS ) . These findings were supported with a larger sample of patients who completed the BDG . Furthermore , group attendees had reduced emergency room visits and mental health related admissions in the six months following BDG . LIMITATIONS The small sample size in the RCT affects power to detect between-group differences . How well improvements after the 12-week BDG were maintained is unknown . 
CONCLUSIONS There is preliminary evidence that DBT skills reduce depressive symptoms , improve affective control , and improve mindfulness self-efficacy in BD . Its application warrants further evaluation in larger studies . Introduction : Bipolar disorder is characterized by recurrent episodes of depression and/or mania along with interepisodic mood symptoms that interfere with psychosocial functioning . Despite periods of symptomatic recovery , many individuals with bipolar disorder continue to experience substantial residual mood symptoms that often lead to the recurrence of mood episodes . BACKGROUND The present open study investigates the feasibility of Mindfulness-based cognitive therapy ( MBCT ) in groups solely composed of bipolar patients of various subtypes . MBCT has been mostly evaluated with remitted unipolar depressed patients and little is known about this treatment in bipolar disorder . METHODS Bipolar outpatients ( type I , II and NOS ) were included and evaluated for depressive and hypomanic symptoms , as well as mindfulness skills before and after MBCT . Patients ' expectations before the program , perceived benefit after completion and frequency of mindfulness practice were also recorded . RESULTS Of 23 included patients , 15 attended at least four MBCT sessions . Most participants reported having durably , moderately to very much benefited from the program , although mindfulness practice decreased over time . Whereas no significant increase of mindfulness skills was detected during the trial , change of mindfulness skills was significantly associated with change of depressive symptoms between pre- and post-MBCT assessments . CONCLUSIONS MBCT is feasible and well perceived among bipolar patients . Larger and randomized controlled studies are required to further evaluate its efficacy , in particular regarding depressive and (hypo)manic relapse prevention . 
The mediating role of mindfulness on clinical outcome needs further examination , and efforts should be made to enhance the persistence of meditation practice over time .
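The ORBIT pilot above reports paired-samples effect sizes alongside its t statistics , e.g. t(15)=2.88 with Cohen's dz=.72 . These two quantities are linked by dz = t / sqrt(n) , where n is the number of pairs ( t(15) implies n = 16 ) , which can be verified directly :

```python
import math

def cohens_dz_from_t(t_value, n_pairs):
    """Paired-samples (within-subject) effect size: dz = t / sqrt(n)."""
    return t_value / math.sqrt(n_pairs)

# Completer analysis: t(15) = 2.88, i.e. 16 pairs -> dz ~ .72
dz_completers = cohens_dz_from_t(2.88, 16)

# Intent-to-treat analysis: t(25) = 2.65, i.e. 26 pairs -> dz ~ .52
dz_itt = cohens_dz_from_t(2.65, 26)
```

Recomputing both values reproduces the reported effect sizes to two decimals , a useful consistency check when reading pre-post trial reports .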
11,030
26,891,227
We conducted a systematic review selecting studies employing cognitive assessments . Summary : We identified few studies using objective measurements to determine whether cognitive aspects associated with the frontal lobe correlate with dementia in this population . We observed a tendency toward such correlations .
Background : There is a proven link between Down syndrome and the early development of the neuropathological features of Alzheimer 's disease ( AD ) . Changes in the personality and behavior of adults with Down syndrome might indicate the early stages of dementia or of frontotemporal lobar degeneration . The objective of this study was to investigate the executive functions and changes in behavior associated with frontal lobe degeneration in individuals with Down syndrome who develop AD .
We examined the pattern of neuroanatomic abnormalities in adults with Down 's syndrome ( DS ) and the cognitive correlates of these abnormalities . Specifically , we compared this pattern with what would be predicted by the hypotheses attributing DS pathology to either premature aging or Alzheimer 's disease . We measured a number of brain regions on MRIs of 25 subjects : 13 persons with the DS phenotype and 12 age- and sex-matched healthy volunteers . Study participants had no history of cardiovascular disease , diabetes , thyroid dysfunction , or seizure disorder . After statistical adjustment for differences in body size , we found that , in comparison with controls , DS subjects had substantially smaller cerebral and cerebellar hemispheres , ventral pons , mammillary bodies , and hippocampal formations . In the cerebellar vermis of DS subjects , we observed smaller lobules VI to VIII without appreciable differences in other regions . In addition , we noted trends for shrinkage of the dorsolateral prefrontal cortex , anterior cingulate gyrus , inferior temporal and parietal cortices , parietal white matter , and pericalcarine cortex in DS subjects compared with normal controls . The parahippocampal gyrus was larger in DS subjects . We found no significant group differences in the volumes of the prefrontal white matter , the orbitofrontal cortex , the pre- and postcentral gyri , or the basal ganglia . We conclude that the pattern of selective cerebral damage in DS does not clearly fit the predictions of the premature aging or Alzheimer 's disease hypotheses . To examine the relationship between brain abnormalities and cognitive deficits observed in DS , we correlated the size of brain regions that were significantly reduced in DS with performance on tests of intelligence and language . The correlational analysis suggested age-related decline in the DS subjects in general intelligence and basic linguistic skills . 
General intelligence and mastery of linguistic concepts correlated negatively with the volume of the parahippocampal gyrus . There was no relationship between total brain size and the cognitive variables . BACKGROUND While neuropathological studies indicate a high risk for Alzheimer 's disease in adults with Down 's syndrome , neuropsychological studies suggest a lower prevalence of dementia . In this study , cognitive deterioration in adults with Down 's syndrome was examined prospectively over 4 years to establish rates and profiles of cognitive deterioration . METHODS Fifty-seven people with Down 's syndrome aged 30 years or older were assessed using a battery of neuropsychological tests on five occasions across 50 months . Assessments of domains of cognitive function known to change with the onset of Alzheimer-related dementia were employed . These included tests of learning , memory , orientation , agnosia , apraxia and aphasia . The individual growth trajectory methodology was used to analyse change over time . RESULTS Severe cognitive deterioration , such as acquired aphasia , apraxia and agnosia , was evident in 28.3 % of those aged over 30 and a higher prevalence of these impairments was associated with older age . The rate of cognitive deterioration also increased with age and degree of pre-existing cognitive impairment . Additionally , deterioration in memory , learning and orientation preceded the acquisition of aphasia , agnosia and apraxia . CONCLUSIONS The prevalence of cognitive impairments consistent with the presence of Alzheimer 's disease is lower than that suggested by neuropathological studies . The pattern of the acquisition of cognitive impairments in adults with Down 's syndrome is similar to that seen in individuals with Alzheimer 's disease who do not have Down 's syndrome . Frontotemporal dementia ( FTD ) is often misdiagnosed as Alzheimer 's disease ( AD ) . 
We hypothesized that the first symptoms associated with FTD would be different from those seen in AD and that the first symptoms in FTD would reflect loss of function in the frontal region with the greatest degree of degeneration . The objective of the study was to compare the earliest symptoms in patients with FTD and AD , and to delineate the symptoms that were associated with right , left or bilateral frontotemporal degeneration in FTD . The first symptoms in 52 FTD and 101 AD patients were determined in retrospect . Based on functional imaging studies , the FTD patients were divided into those with predominantly bilateral ( n = 15 ) , left-sided ( n = 19 ) and right-sided ( n = 18 ) patterns of atrophy . The results showed that disinhibition , social awkwardness , passivity and loss of executive function were more common in FTD , while memory loss was more common in AD . Disinhibition was greatest in the asymmetric right-sided group , language dysfunction was commonest in the asymmetric left-sided group and loss of executive function was most frequent in the bilateral group . In summary , different first symptoms appeared in FTD and AD , which may help distinguish between the diseases . The anatomic site for FTD largely determined the kind of first symptoms . BACKGROUND The clinical and neuropathological features associated with dementia in Down 's syndrome ( DS ) are not well established . Aims To examine clinico-pathological correlations and the incidence of cognitive decline in a cohort of adults with DS . METHOD A total of 92 hospitalized persons with DS were followed up from 1985 to December 2000 . At outset , 87 participants were dementia-free , with a median age of 38 years . Assessments included the Prudhoe Cognitive Function Test ( PCFT ) and the Adaptive Behavior Scale ( ABS ) , to measure cognitive and behavioural deterioration . Dementia was diagnosed from case records and caregivers ' reports . 
RESULTS Eighteen (21%) patients developed dementia during follow-up, with a median age of onset of 55.5 years (range 45-74). The PCFT demonstrated cognitive decline among those with a less severe intellectual disability (mild and moderate) but not among the profoundly disabled people (severe and profound). Clinical dementia was associated with neuropathological features of Alzheimer's disease, and correlated with neocortical neurofibrillary tangle densities. At the age of 60 years and above, a little more than 50% of patients still alive had clinical evidence of dementia. CONCLUSIONS Clinical dementia associated with measurable cognitive and functional decline is frequent in people with DS after middle age, and can be readily diagnosed among less severely intellectually disabled persons using measures of cognitive function such as the PCFT and behavioural scales such as the ABS. In the more profoundly disabled people, the diagnosis of dementia is facilitated by the use of behavioural and neurological criteria. In this study, the largest prospective DS series including neuropathology on deceased patients, the density of neurofibrillary tangles related more closely to the dementia of DS than senile plaques. In people with DS surviving to middle and old age, the development of dementia of Alzheimer type is frequent but not inevitable, and some people with DS reach old age without clinical features of dementia. CYP17 and CYP19 are involved in the peripheral synthesis of estrogens, and polymorphisms in CYP17 and CYP19 have been associated with increased risk of estrogen-related disorders. Women with Down syndrome (DS) have early onset and high risk for Alzheimer's disease (AD). We conducted a prospective community-based cohort study to examine the relationship between SNPs in CYP17 and CYP19 and the cumulative incidence of AD, hormone levels and sex hormone binding globulin in women with DS.
Two hundred and thirty-five women with DS, 31 to 67 years of age and nondemented at initial examination, were assessed for cognitive and functional abilities, behavioral/psychiatric conditions, and health status at 14-20 month intervals over five assessment cycles. We genotyped these individuals for single-nucleotide polymorphisms (SNPs) in CYP17 and CYP19. Four SNPs in CYP17 were associated with a two and one half-fold increased risk of AD, independent of APOE genotype. Four SNPs in CYP19 were associated with a two-fold increased risk of AD, although three were significant only in those without an APOE ε4 allele. Further, carrying high-risk alleles in both CYP17 and CYP19 was associated with an almost four-fold increased risk of AD (OR = 3.8, 95% CI, 1.6-9.5) and elevated sex hormone binding globulin in postmenopausal women. The main effect of the CYP17 and CYP19 variants was to decrease the age at onset. These findings suggest that genes contributing to estrogen bioavailability influence risk of AD in women with DS. Research based on retrospective reports by carers suggests that the presentation of dementia in people with Down's syndrome may differ from that typical of Alzheimer's disease (AD) in the general population, with the earliest changes tending to be in personality or behaviour rather than in memory. This is the first long-term prospective study to test the hypothesis that such changes, which are more typical of dementia of frontal type (DFT) in the general population, mark the preclinical stage of AD in DS. BACKGROUND Recent research has suggested a specific impairment in frontal-lobe functioning in the preclinical stages of Alzheimer's disease (AD) in people with Down's syndrome (DS), characterised by prominent changes in personality or behaviour.
The aim of the current paper is to explore whether particular kinds of change (namely executive dysfunction (EDF), disinhibition and apathy), associated in the literature with disruption of different underlying frontal-subcortical circuits, are a) more or less frequently reported than others and b) related to poor performance on tasks involving different cognitive processes. METHOD Seventy-eight participants (mean age 47 years, range 36-72) with DS and mild to moderate intellectual disability (based on ICD-10 criteria), without a diagnosis of dementia of Alzheimer's type (DAT) or other psychiatric disorders, were selected from a larger sample of older adults with DS (n = 122). Dementia diagnosis was based on the CAMDEX informant interview, conducted with each participant's main carer. Informant-reported changes in personality/behaviour and memory were recorded. Participants were scored based on symptoms falling into three behavioural domains and completed five executive function (EF) tasks, six memory tasks (two of which also had a strong executive component) and the BPVS (as a measure of general intellectual ability). Multiple regression analyses were conducted to determine the degree to which the behavioural variables of 'EDF', 'disinhibition' and 'apathy', along with informant-reported memory decline and antidepressant medication use, predicted performance on the cognitive tasks (whilst controlling for the effects of age and general intellectual ability). RESULTS Strikingly, disinhibited behaviour was reported for 95.7% of participants with one or more behavioural changes (n = 47), compared to 57.4% with reported apathy and 36.2% with reported EDF.
'Disinhibition' score significantly predicted performance on three EF tasks (designed to measure planning, response inhibition and working memory) and an object memory task (also thought to place high demands on working memory), while 'apathy' score significantly predicted performance on two different tasks, those measuring spatial reversal and prospective memory (p < 0.05). Informant-reported memory decline was associated only with performance on a delayed recall task, while antidepressant medication use was associated with better performance on a working memory task (p < 0.05). CONCLUSION The observed dissociation between performance on cognitive tasks associated with reported apathy and disinhibition is in keeping with proposed differences in underlying neural circuitry and supports the involvement of multiple frontal-subcortical circuits in the early stages of DAT in DS. However, the prominence of disinhibition in the behavioural profile (which more closely resembles that of the disinhibited subtype of DFT than that of AD in the general population) leads us to postulate that the serotonergically mediated orbitofrontal circuit may be disproportionately affected. A speculative theory is developed regarding the biological basis for the observed changes, and discussion is focused on how this understanding may aid us in the development of treatments directly targeting underlying abnormalities. BACKGROUND Prevalence of Alzheimer's disease in people with Down's syndrome is very high, and many such individuals who are older than 40 years have pathological changes characteristic of Alzheimer's disease. Evidence to support treatment with Alzheimer's drugs is inadequate, although memantine is beneficial in transgenic mice. We aimed to assess the safety and efficacy of memantine on cognition and function in individuals with Down's syndrome.
METHODS In our prospective randomised double-blind trial, we enrolled adults (>40 years) with karyotypic or clinically diagnosed Down's syndrome, with and without dementia, at four learning disability centres in the UK and Norway. We randomly allocated participants (1:1) to receive memantine or placebo for 52 weeks by use of a computer-generated sequence and a minimisation algorithm to ensure balanced allocation for five prognostic factors (sex, dementia, age group, total Down's syndrome attention, memory, and executive function scales [DAMES] score, and centre). The primary outcome was change in cognition and function, measured with DAMES scores and the adaptive behaviour scale (ABS) parts I and II. We analysed differences in DAMES and ABS scores between groups with analyses of covariance or quantile regression in all patients who completed the 52-week assessment and had available follow-up data. This study is registered, number ISRCTN47562898. FINDINGS We randomly allocated 88 patients to receive memantine (72 [82%] had DAMES data and 75 [85%] had ABS data at 52 weeks) and 85 to receive placebo (74 [87%] and 73 [86%]). Both groups declined in cognition and function but rates did not differ between groups for any outcomes. After adjustment for baseline score, there were non-significant differences between groups of -4.1 (95% CI -13.1 to 4.8) in DAMES scores, -8.5 (-20.1 to 3.1) in ABS I scores, and 2.0 (-7.2 to 11.3) in ABS II scores, all in favour of controls. 10 (11%) of 88 participants in the memantine group and six (7%) of 85 controls had serious adverse events (p=0.33). Five participants in the memantine group and four controls died from serious adverse events (p=0.77). INTERPRETATION There is a striking absence of evidence about pharmacological treatment of cognitive impairment and dementia in people older than 40 years with Down's syndrome.
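The "minimisation algorithm" mentioned above is a standard covariate-adaptive randomisation scheme (Pocock-Simon style). As an illustrative sketch only (not the trial's actual software; the function name, data layout and arm labels are assumptions), each new participant is assigned to whichever arm would leave the prognostic factors most balanced:

```python
import random

def minimization_assign(counts, levels, arms=("memantine", "placebo"), rng=random):
    """Pocock-Simon minimisation sketch: choose the arm that minimises
    total imbalance across the participant's prognostic-factor levels.
    counts[arm][factor][level] holds how many prior participants with
    that factor level are already in that arm (hypothetical layout)."""
    imbalance = {}
    for arm in arms:
        total = 0
        for factor, level in levels.items():
            # Count per arm if this participant were placed in `arm`.
            hypothetical = {
                a: counts[a][factor].get(level, 0) + (1 if a == arm else 0)
                for a in arms
            }
            total += max(hypothetical.values()) - min(hypothetical.values())
        imbalance[arm] = total
    best = min(imbalance.values())
    # Break ties at random so allocation stays unpredictable.
    choice = rng.choice([a for a in arms if imbalance[a] == best])
    for factor, level in levels.items():
        counts[choice][factor][level] = counts[choice][factor].get(level, 0) + 1
    return choice
```

In practice a trial would track all five factors listed above (sex, dementia, age group, DAMES score band, centre); the sketch generalises to any number of factors via the `levels` dict.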
Despite promising indications, memantine is not an effective treatment. Therapies that are effective for Alzheimer's disease are not necessarily effective in this group of patients. FUNDING Lundbeck. Objective/Background: To determine whether difficulty in the early differentiation between frontotemporal dementia (FTD) and AD may arise from a failure to discriminate between the temporal and frontal variants of FTD. Methods: Neuropsychological profiles of patients with early dementia of Alzheimer type (DAT; n = 10), the temporal variant of FTD (tv-FTD or semantic dementia; n = 5), and the frontal variant of FTD (fv-FTD; n = 10) were compared to each other and to normal controls (n = 10). Structural MRI demonstrated temporal lobe atrophy in the tv-FTD patients and frontal lobe atrophy in the fv-FTD group. Results: Subjects with tv-FTD showed severe deficits in semantic memory with preservation of attention and executive function. Subjects with fv-FTD showed the reverse pattern. Attention and executive function impairment separated the fv-FTD patients from the early DAT subjects, who were densely amnesic. Conclusion: The double dissociation in performance on semantic memory and attention/executive function clearly separated the temporal and frontal variants of FTD and aids the early differentiation of FTD from AD. The characteristic cognitive profiles reflect the distribution of pathology within each syndrome and support the putative role of the inferolateral temporal neocortex in semantic memory, the medial temporal lobe structures of the hippocampal complex in episodic memory, and the frontal lobes in executive function. The objective of this study was to determine the association of seizures and cognitive decline in adults with Down syndrome (DS) and Alzheimer's-type dementia. A retrospective data analysis was carried out following a controlled study of antioxidant supplementation for dementia in DS.
Observations were made at baseline and every 6 months for 2 years. Seizure history was obtained from study records. The primary outcome measures comprised the performance-based Severe Impairment Battery (SIB) and Brief Praxis Test (BPT). Secondary outcome measures comprised the informant-based Dementia Questionnaire for Mentally Retarded Persons and Vineland Adaptive Behavior Scales. Because a large proportion of patients with seizures had such severe cognitive decline as to become untestable on the performance measures, time to "first inability to test" was measured. Adjustments were made for the potentially confounding covariates of age, gender, APOE4 status, baseline cognitive impairment, years since dementia onset at baseline, and treatment assignment. The estimated odds ratio for the time to "first inability to test" on the SIB comparing those with seizures to those without is 11.02 (95% CI: 1.59, 76.27), a ratio that is significantly different from 1 (p = 0.015). Similarly, we estimated an odds ratio of 9.02 (95% CI: 1.90, 42.85) on the BPT, a ratio also significantly different from 1 (p = 0.006). Results from a secondary analysis of the informant measures showed significant decline related to seizures. We conclude that there is a strong association of seizures with cognitive decline in demented individuals with DS. Prospective studies exploring this relationship in DS are indicated.
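Odds ratios with log-scale (Woolf) confidence intervals, like those reported above, are computed from a 2x2 table. A minimal sketch (the function and the example counts are illustrative, not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for the 2x2 table [[a, b], [c, d]] with a Woolf
    (log-scale) 95% confidence interval.
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) is sqrt of summed reciprocal cell counts.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

Note how small cell counts inflate the standard error, which is why the intervals quoted above (e.g. 1.59 to 76.27) are so wide despite large point estimates.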
11,031
23,579,335
The benefit seen in the use of LLLT, although statistically significant, does not constitute the threshold of minimally important clinical difference.
The aim of this study is to determine the efficacy of low-level laser therapy (LLLT) in reducing acute and chronic neck pain as measured by the visual analog scale (VAS).
We present a double-blind trial in which a pulsed infrared beam was compared with a placebo in the treatment of myofascial pain in the cervical region. The patients were submitted to 12 sessions on alternate days to a total energy dose of 5 J each. At each session, the four most painful muscular trigger points and five bilateral homometameric acupuncture points were irradiated. Those in the placebo group submitted to the same number of sessions following an identical procedure, the only difference being that the laser apparatus was nonoperational. Pain was monitored using the Italian version of the McGill pain questionnaire and the Scott-Huskisson visual analogue scale. The results show a pain attenuation in the treated group and a statistically significant difference between the two groups of patients, both at the end of therapy and at the 3-month follow-up examination. OBJECTIVE The objective of the study was to investigate the clinical effects of low-level laser therapy (LLLT) in patients with acute neck pain with radiculopathy. DESIGN Double-blind, randomized, placebo-controlled study. SETTING The study was carried out between January 2005 and September 2007 at the Clinic for Rehabilitation at the Medical School, University of Belgrade, Serbia. PATIENTS AND INTERVENTION Sixty subjects received a course of 15 treatments over 3 weeks with an active or an inactivated laser as a placebo procedure. LLLT was applied to the skin projection at the anatomical site of the spinal segment involved with the following parameters: wavelength 905 nm, frequency 5,000 Hz, power density of 12 mW/cm(2), dose of 2 J/cm(2), treatment time 120 seconds, at a whole dose of 12 J/cm(2). OUTCOME MEASURES The primary outcome measure was pain intensity as measured by a visual analog scale. Secondary outcome measures were neck movement, neck disability index, and quality of life.
Measurements were taken before treatment and at the end of the 3-week treatment period. RESULTS Statistically significant differences between groups were found for intensity of arm pain (P = 0.003, with high effect size d = 0.92) and for neck extension (P = 0.003, with high effect size d = 0.94). CONCLUSION LLLT gave more effective short-term relief of arm pain and increased range of neck extension in patients with acute neck pain with radiculopathy in comparison to the placebo procedure. OBJECTIVE We aimed to evaluate the effectiveness of laser therapy in myofascial pain syndrome treatment. BACKGROUND DATA Myofascial pain syndrome is a disease that is characterized by hypersensitive points called trigger points found in one or more muscles and/or connective tissues. It can cause pain, muscle spasm, sensitivity, stiffness, weakness, limitation of range of motion and, rarely, autonomic dysfunction. Physical therapy modalities and exercise are used in the treatment of this frequently encountered disease. METHODS The placebo-controlled, prospective, long-term follow-up study was planned with 60 patients who had trigger points in their upper trapezius muscles. The patients were divided into three groups randomly. Stretching exercises were taught to each group and they were asked to exercise at home. Treatment duration was 4 weeks. Placebo laser was applied to group 1, dry needling to group 2 and laser to group 3. He-Ne laser was applied to three trigger points in the upper trapezius muscles on both sides at 632.8 nm. The patients were assessed before treatment, post-treatment, and 6 months after treatment for pain, cervical range of motion and functional status. RESULTS We observed a significant decrease in pain at rest and at activity, and an increase in pain threshold, in the laser group compared to the other groups. Improvement according to the Nottingham Health Profile showed the superiority of the laser treatment.
However, those differences among the groups were not observed at the 6-month follow-up. CONCLUSIONS Laser therapy could be useful as a treatment modality in myofascial pain syndrome because of its noninvasiveness, ease, and short-term application. A randomized, double-blind, placebo-controlled study of low-level laser therapy (LLLT) in 90 subjects with chronic neck pain was conducted with the aim of determining the efficacy of 300 mW, 830 nm laser in the management of chronic neck pain. Subjects were randomized to receive a course of 14 treatments over 7 weeks with either active or sham laser to tender areas in the neck. The primary outcome measure was change in a 10 cm Visual Analogue Scale (VAS) for pain. Secondary outcome measures included the Short-Form 36 Quality-of-Life questionnaire (SF-36), Northwick Park Neck Pain Questionnaire (NPNQ), Neck Pain and Disability Scale (NPAD), the McGill Pain Questionnaire (MPQ) and Self-Assessed Improvement (SAI) in pain measured by VAS. Measurements were taken at baseline, at the end of 7 weeks' treatment and 12 weeks from baseline. The mean VAS pain scores improved by 2.7 in the treated group and worsened by 0.3 in the control group (difference 3.0, 95% CI 2.1-3.8). Significant improvements were seen in the active group compared to placebo for the SF-36 Physical Score (SF36 PCS), NPNQ, NPAD, MPQ VAS and SAI. The results of the SF-36 Mental Score (SF36 MCS) and other MPQ component scores (affective and sensory) did not differ significantly between the two groups. Low-level laser therapy (LLLT), at the parameters used in this study, was efficacious in providing pain relief for patients with chronic neck pain over a period of 3 months. Low-energy laser therapy has been applied in several rheumatoid and soft tissue disorders with varying rates of success.
The objective of our study was to investigate the effect of laser therapy on cervical myofascial pain syndrome with a placebo-controlled double-blind prospective study model. It was performed with a total of 53 patients (35 females and 18 males) with cervical myofascial pain syndrome. In group 1 (n=23), GaAs laser treatment was applied over three trigger points bilaterally and also one point in the taut bands in the trapezius muscle bilaterally with a frequency of 1000 Hz for 2 min over each point once a day for 10 days during a period of 2 weeks. In group 2 (n=25), the same treatment protocol was given, but the laser instrument was switched off during applications. All patients in both groups were instructed to perform daily isometric exercises and stretching just short of pain for 2 weeks at home. Evaluations were performed just before treatment (week 0), immediately after (week 2), and 12 weeks later (week 14). Evaluation parameters included pain, algometric measurements, and cervical lateral flexion. Statistical analysis was done on data collected from the three evaluation stages. The results were evaluated in 48 patients (32 females, 16 males). Week 2 and week 14 results showed significant improvement in all parameters for both groups. However, comparison of the percentage changes both immediately and 12 weeks after treatment did not show a significant difference relative to pretreatment values. In conclusion, the results of our study have not shown the superiority of GaAs laser therapy over placebo in the treatment of cervical myofascial pain syndrome, but we suggest that further studies on this topic be done using different laser types and dosages in larger patient populations. Pain intensity is frequently measured on an 11-point pain intensity numerical rating scale (PI-NRS), where 0 = no pain and 10 = worst possible pain.
However, it is difficult to interpret the clinical importance of changes from baseline on this scale (such as a 1- or 2-point change). To date, there are no data-driven estimates for clinically important differences in pain intensity scales used for chronic pain studies. We have estimated a clinically important difference on this scale by relating it to global assessments of change in multiple studies of chronic pain. Data on 2724 subjects from 10 recently completed placebo-controlled clinical trials of pregabalin in diabetic neuropathy, postherpetic neuralgia, chronic low back pain, fibromyalgia, and osteoarthritis were used. The studies had similar designs and measurement instruments, including the PI-NRS, collected in a daily diary, and the standard seven-point patient global impression of change (PGIC), collected at the endpoint. The changes in the PI-NRS from baseline to the endpoint were compared to the PGIC for each subject. Categories of 'much improved' and 'very much improved' were used as determinants of a clinically important difference, and the relationship to the PI-NRS was explored using graphs, box plots, and sensitivity/specificity analyses. A consistent relationship between the change in PI-NRS and the PGIC was demonstrated regardless of study, disease type, age, sex, study result, or treatment group. On average, a reduction of approximately two points or a reduction of approximately 30% in the PI-NRS represented a clinically important difference. The relationship between percent change and the PGIC was also consistent regardless of baseline pain, while higher baseline scores required larger raw changes to represent a clinically important difference. The application of these results to future studies may provide a standard definition of clinically important improvement in clinical trials of chronic pain therapies.
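The "two points or 30% reduction" rule above translates directly into a responder criterion. A minimal sketch (the function name and thresholds-as-parameters are illustrative; the threshold values come from the pooled analysis summarized above):

```python
def clinically_important(baseline, endpoint,
                         abs_threshold=2.0, pct_threshold=0.30):
    """Flag a clinically important improvement on the 0-10 PI-NRS:
    roughly a 2-point or 30% reduction from baseline, per the pooled
    pregabalin-trial analysis. Returns False for zero baseline pain,
    where percent change is undefined."""
    if baseline <= 0:
        return False
    reduction = baseline - endpoint
    return reduction >= abs_threshold or (reduction / baseline) >= pct_threshold
```

The percent criterion matters at low baselines: a 1-point drop from 3 to 2 qualifies (33%), while the same 1-point drop from 7 to 6 does not, mirroring the finding that higher baseline scores require larger raw changes.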
Use of a standard outcome across chronic pain studies would greatly enhance the comparability, validity, and clinical applicability of these studies. BACKGROUND AND OBJECTIVES A prospective, double-blind, randomized, and controlled trial was conducted in patients with chronic myofascial pain syndrome (MPS) in the neck to evaluate the effects of infrared low-level 904 nm gallium-arsenide (Ga-As) laser therapy (LLLT) on clinical status and quality of life (QoL). STUDY DESIGN/PATIENTS AND METHODS The study group consisted of 60 MPS patients. Patients were randomly assigned to two treatment groups: Group I (actual laser; 30 patients) and Group II (placebo laser; 30 patients). LLLT continued daily for 2 weeks except weekends. Follow-up measures were evaluated at baseline and at 2, 3, and 12 weeks. All patients were evaluated with respect to pain at rest, pain at movement, number of trigger points (TP), the Neck Pain and Disability Visual Analog Scale (NPAD), Beck Depression Inventory (BDI), and the Nottingham Health Profile (NHP). RESULTS In the active laser group, statistically significant improvements were detected in all outcome measures compared with baseline (P < 0.01), while in the placebo laser group significant improvements were detected only in the pain score at rest 1 week after the end of treatment. The score for self-assessed improvement of pain was significantly different between the active and placebo laser groups (63 vs. 19%) (P < 0.01). CONCLUSION This study revealed that short-period application of LLLT is effective in pain relief and in the improvement of functional ability and QoL in patients with MPS. The efficacy of low-level laser therapy (LLLT) in myofascial pain syndrome (MPS) seems controversial.
A prospective, double-blind, randomized controlled trial was conducted in patients with chronic MPS in the neck to evaluate the effects of low-level 830-nm gallium arsenide aluminum (Ga-As-Al) laser therapy. The study group consisted of 64 MPS patients. The patients were randomly assigned to two groups. In group 1 (n = 32), Ga-As-Al laser treatment was applied over three trigger points bilaterally for 2 min over each point once a day for 15 days during a period of 3 weeks. In group 2 (n = 32), the same treatment protocol was given, but the laser instrument was switched off during applications. All patients in both groups performed daily isometric and stretching exercises for the cervical region. Parameters were measured at baseline and after 4 weeks. All patients were evaluated with respect to pain (at rest, movement, and night) assessed by visual analog scale, measurement of active range of motion using an inclinometer and a goniometer, and the neck disability index. In both groups, statistically significant improvements were detected in all outcome measures compared with baseline (p < 0.05). However, no significant differences were obtained between the two groups (p > 0.05). In conclusion, although laser therapy showed no superiority over placebo in this study, we cannot exclude the possibility of effectiveness with another treatment regimen including different laser wavelengths and dosages (different intensity and density and/or treatment interval). Pain is a major symptom in cervical osteoarthritis (COA). Low-power laser (LPL) therapy has been claimed to reduce pain in musculoskeletal pathologies, but there have been concerns about this point. The aim of this study was to evaluate the analgesic efficacy of LPL therapy and related functional changes in COA.
Sixty patients between 20 and 65 years of age with clinically and radiologically diagnosed COA were included in the study. They were randomised into two equal groups according to the therapies applied, either with LPL or placebo laser. Patients in each group were investigated blindly in terms of pain and pain-related physical findings, such as increased paravertebral muscle spasm, loss of lordosis and restriction of the range of neck motion, before and after therapy. Functional improvements were also evaluated. Pain, paravertebral muscle spasm, lordosis angle, the range of neck motion and function were observed to improve significantly in the LPL group, but no improvement was found in the placebo group. LPL seems to be successful in relieving pain and improving function in osteoarthritic diseases. OBJECTIVE Reliable and valid measures of pain are needed to advance research initiatives on appropriate and effective use of analgesia in the emergency department (ED). The reliability of visual analog scale (VAS) scores has not been demonstrated in the acute setting, where pain fluctuation might be greater than for chronic pain. The objective of the study was to assess the reliability of the VAS for measurement of acute pain. METHODS This was a prospective convenience sample of adults with acute pain presenting to two EDs. Intraclass correlation coefficients (ICCs) with 95% confidence intervals (95% CIs) and a Bland-Altman analysis were used to assess the reliability of paired VAS measurements obtained 1 minute apart every 30 minutes over two hours. RESULTS The summary ICC for all paired VAS scores was 0.97 [95% CI = 0.96 to 0.98]. The Bland-Altman analysis showed that 50% of the paired measurements were within 2 mm of one another, 90% were within 9 mm, and 95% were within 16 mm. The paired measurements were more reproducible at the extremes of pain intensity than at moderate levels of pain.
CONCLUSIONS The reliability of the VAS for acute pain measurement as assessed by the ICC appears to be high. Ninety percent of the pain ratings were reproducible within 9 mm. These data suggest that the VAS is sufficiently reliable to be used to assess acute pain.
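The Bland-Altman analysis used above summarizes agreement between paired measurements by the mean difference (bias) and its 95% limits of agreement. A minimal sketch (the function name and the toy VAS readings are illustrative, not the study's data):

```python
from statistics import mean, stdev

def bland_altman_limits(x, y):
    """Bland-Altman bias and 95% limits of agreement for paired
    measurements, e.g. two VAS readings taken 1 minute apart.
    Returns (bias, lower_limit, upper_limit)."""
    diffs = [a - b for a, b in zip(x, y)]
    bias = mean(diffs)
    sd = stdev(diffs)  # sample standard deviation of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

If roughly 95% of paired differences fall inside the limits, and the limits themselves are clinically small (e.g. a few mm on a 100 mm VAS), the instrument is considered reproducible, which is the logic behind the "90% within 9 mm" result above.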
11,032
23,721,258
Pharmacokinetics, plasma renin and genetic polymorphisms did not reliably predict the response of patients of African ancestry to antihypertensive drugs. Conclusion Available data are inconclusive regarding why patients of African ancestry display the typical response to antihypertensive drugs. In lieu of biochemical or pharmacogenomic parameters, self-defined African ancestry seems the best available predictor of individual responses to antihypertensive drugs.
Background Clinicians are encouraged to take an individualized approach when treating hypertension in patients of African ancestry, but little is known about why the individual patient may respond well to calcium blockers and diuretics, but generally has an attenuated response to drugs inhibiting the renin-angiotensin system and to β-adrenergic blockers. Therefore, we systematically reviewed the factors associated with the differential drug response of patients of African ancestry to antihypertensive drug therapy.
BACKGROUND African Americans have a disproportionate burden of hypertension and comorbid disease. Pharmacogenetic markers of blood pressure response have yet to be defined clearly. This study explores the association between G-protein-coupled receptor kinase type 4 (GRK4) variants and blood pressure response to metoprolol among African Americans with early hypertensive nephrosclerosis. METHODS Participants from the African American Study of Kidney Disease and Hypertension (AASK) trial were genotyped at three GRK4 polymorphisms: R65L, A142V, and A486V. A Cox proportional hazards model, stratified by gender, was used to determine the relationship between GRK4 variants and time to reach a mean arterial pressure (MAP) of 107 mm Hg, adjusted for other predictors of blood pressure response. Potential interactions between the three polymorphisms were explored by analyzing the effects of gene haplotypes and by stratifying the analysis by neighboring sites. RESULTS The hazard ratio with 95% confidence interval for A142V among men randomized to a usual MAP (102-107 mm Hg) was 1.54 (1.11-2.44; P = 0.0009). The hazard ratio for A142V with R65/L65 or L65/L65 was 2.14 (1.35-3.39; P = 0.001). Haplotype analyses were consistent but inconclusive. There was no association between A142V and blood pressure response among women. CONCLUSIONS Results suggest a sex-specific relationship between GRK4 A142V and blood pressure response among African-American men with early hypertensive nephrosclerosis. Men with GRK4 A142 were less responsive to metoprolol if they had a GRK4 L65 variant.
The effect of GRK4 variants and blood pressure response to metoprolol should be studied in larger clinical trials. A single-blind, placebo-controlled study was conducted to compare the inhibitory effects of intravenous propranolol on exercise-induced tachycardia in eight matched pairs of healthy black and white volunteers, consisting of equal numbers of men and women. At the beginning of each test session, subjects were exercised on a bicycle ergometer, with the load predetermined to increase their heart rates to 160-190 beats/min. Exactly 30 min after having received an intravenous injection of placebo or propranolol (the dose varying from 0.0125 to 0.15 mg/kg), they were subjected to the same exercise load. Heart rates were continuously recorded throughout the exercise periods, and readings at 6, 8, and 10 min were compared for pre- and postinjection periods. A washout period of at least 3 days was allowed to elapse between sessions. Postinjection reductions of heart rate in blacks were compared with postinjection reductions in their white counterparts. Analysis of the results indicated that the dose-response curve for the heart rate reduction resulting from propranolol for blacks was shifted to the right with respect to that for whites. In clinical terms, this means that to achieve the same degree of beta-blockade in blacks, a larger dose of propranolol is required than for whites, but that the same maximum is attainable. In a double-blind outcome trial conducted in hypertensive patients randomized to chlorthalidone (C), amlodipine (A), lisinopril (L), or doxazosin (D), the α-adducin Gly460Trp polymorphism was typed (n = 36,913). Mean follow-up was 4.9 years. Relative risks (RRs) of chlorthalidone versus other treatments were compared between genotypes (Gly/Gly + Gly/Trp versus Trp/Trp). The primary outcome was coronary heart disease (CHD).
Coronary heart disease incidence did not differ among treatments or genotypes, nor was there any interaction between treatment and genotype (P = 0.660). Subgroup analyses indicated that Trp allele carriers had greater CHD risk with C versus A+L in women (RR = 1.31) but not men (RR = 0.91), with no RR gender differences for non-carriers (gender-gene-treatment interaction, P = 0.002). The α-adducin gene is not an important modifier of antihypertensive treatment on cardiovascular risk, but women Trp allele carriers may have increased CHD risk if treated with C versus A or L. This must be confirmed to have implications for hypertension treatment. Purpose: To explore the association between CYP3A4 and CYP3A5 gene polymorphisms and blood pressure response to amlodipine among participants from the African-American Study of Kidney Disease and Hypertension Trial randomized to amlodipine (n = 164). Methods: Cox proportional hazards models were used to determine the risk of reaching a target mean arterial pressure (MAP) of ≤107 mm Hg by CYP3A4 (A-392G and T16090C) and CYP3A5 (A6986G) gene polymorphisms, stratified by MAP randomization group (low or usual) and controlling for other predictors of blood pressure response. Results: Women randomized to a usual MAP goal with an A allele at CYP3A4 A-392G were more likely to reach a target MAP of 107 mm Hg. The adjusted hazard ratio (AA/AG compared to GG) with 95% confidence interval was 3.41 (1.20-9.64; p = 0.020). Among participants randomized to a lower MAP goal, those with the C allele at CYP3A4 T16090C were more likely to reach target MAP: the adjusted hazard ratio was 2.04 (1.17-3.56; p = 0.010). After adjustment for multiple testing using a threshold significance level of p = 0.016, only the CYP3A4 T16090C SNP remained significant. CYP3A5 A6986G was not associated with blood pressure response.
Conclusions: Our findings suggest that blood pressure response to amlodipine among high-risk African-Americans appears to be determined by CYP3A4 genotypes, and sex specificity may be an important consideration. Clinical applications of CYP3A4 genotype testing for individualized treatment regimens warrant further study. OBJECTIVES The purpose of this study was to evaluate the efficacy and tolerability of monotherapy with the selective aldosterone blocker eplerenone in both black and white patients with hypertension. BACKGROUND Essential hypertension and cardiovascular-renal-target organ damage is more prevalent in black than white adults in the U.S. METHODS Black (n = 348) and white (n = 203) patients with mild-to-moderate hypertension were randomized to double-blind treatment with eplerenone 50 mg, the angiotensin II receptor antagonist losartan 50 mg, or placebo once daily. Doses were increased if blood pressure remained uncontrolled. The primary end point was change in mean diastolic blood pressure (DBP) after 16 weeks of therapy. RESULTS Adjusted mean changes from baseline in DBP were -5.3 ± 0.7, -10.3 ± 0.7, and -6.9 ± 0.6 mm Hg in the placebo, eplerenone-treated, and losartan-treated groups, respectively (mean ± SE, p < 0.001 eplerenone vs. placebo, p < 0.001 eplerenone vs. losartan). In black patients, DBP decreased by -4.8 ± 1.0, -10.2 ± 0.9, and -6.0 ± 0.9 mm Hg for the placebo, eplerenone-treated, and losartan-treated groups, respectively (mean ± SE, p < 0.001 eplerenone vs. placebo, p < 0.001 eplerenone vs. losartan), whereas in white patients, DBP decreased by -6.4 ± 1.0, -11.1 ± 1.1, and -8.4 ± 1.0 mm Hg, respectively (p = 0.001 eplerenone vs. placebo, p = 0.068 for eplerenone vs. losartan).
For reduction of systolic blood pressure (SBP), eplerenone was superior to placebo and losartan in all patients combined and in black patients, and was superior to placebo in white patients. Eplerenone was as effective as losartan in reducing SBP and DBP in the high-renin patient, but more effective than losartan in the low-renin patient. Similarly, eplerenone was at least as effective as losartan in patients with differing baseline levels of aldosterone. Both eplerenone and losartan were well tolerated. CONCLUSIONS The antihypertensive effect of eplerenone was equal in black and white patients and was superior to losartan in black patients. Background — We previously hypothesized that high activity of creatine kinase, the central regulatory enzyme of energy metabolism, facilitates the development of high blood pressure. Creatine kinase rapidly provides adenosine triphosphate to highly energy-demanding processes, including cardiovascular contraction, and antagonizes nitric oxide-mediated functions. Relatively high activity of the enzyme, particularly in resistance arteries, is thought to enhance pressor responses and increase blood pressure. Tissue creatine kinase activity is reported to be high in black people, a population subgroup with greater hypertension risk; the proposed effects of high creatine kinase activity, however, are not "race dependent." We therefore assessed whether creatine kinase is associated with blood pressure in a multiethnic population. Methods and Results — We analyzed a stratified random sample of the population of Amsterdam, the Netherlands, consisting of 1444 citizens (503 white European, 292 South Asian, 580 black, and 69 of other ethnicity) aged 34 to 60 years. We used linear regression analysis to investigate the association between blood pressure and normal serum creatine kinase after rest, as a substitute measure of tissue activity.
Creatine kinase was independently associated with blood pressure, with an increase in systolic and diastolic pressure, respectively, of 8.0 (95% CI, 3.3 to 12.7) and 4.7 (95% CI, 1.9 to 7.5) mm Hg per log creatine kinase increase after adjustment for age, sex, body mass index, and ethnicity. Conclusions — Creatine kinase is associated with blood pressure. Further studies are needed to explore the nature of this association, including how variation in cardiovascular creatine kinase activity may affect pressor responses. Objectives In the vast majority of cases the cause of hypertension is not known. On the basis of observations from black and multiethnic populations, it has been hypothesized that a genetically high tissue creatine kinase activity may be an independent factor responsible for primary hypertension. If the relation between creatine kinase and blood pressure is causal, it is reasonable to believe that it will be independent of ethnicity and present in different populations. In this cross-sectional study, we examined whether creatine kinase was associated with blood pressure in a large Caucasian normal population. Methods and results Data on creatine kinase and blood pressure were analyzed in a random sample of 12,776 men and women (65% of those eligible), aged 30-87 years, from a normal population in the municipality of Tromsø, Norway. We used linear regression to model the association between creatine kinase and blood pressure. Creatine kinase was independently associated with blood pressure. A one-unit increase in log CK was associated with a 3.3 (95% CI 1.4-5.2) mmHg increase in systolic blood pressure and a 1.3 (95% CI 0.3-2.3) mmHg increase in diastolic blood pressure, after adjustment for age, sex, body mass index, s-glucose, s-creatinine, physical activity and alcohol consumption.
The creatine kinase effect on blood pressure was independent of antihypertensive medication, and no difference in creatine kinase level was found between those with controlled and uncontrolled hypertension (geometric mean 101 vs. 104 IU/l, P = 0.1). Conclusion Creatine kinase was associated with blood pressure in this population. BACKGROUND Age and race categories or renin profiling have been recommended to predict blood pressure responses to monotherapy with a beta-blocker or thiazide diuretic. Whether these or other characteristics predict blood pressure responses when the drugs are administered as add-on therapy is uncertain. METHODS We evaluated predictors of blood pressure response in 363 men and women ≤65 years of age with primary hypertension (152 blacks, 211 whites), 86 of whom (24%) were untreated and 277 of whom (76%) were withdrawn from previous antihypertensive drugs before randomization to either atenolol followed by addition of hydrochlorothiazide (N = 180) or hydrochlorothiazide followed by addition of atenolol (N = 183). Responses were determined by home blood pressure averages before and after each drug administration. Race, age, plasma renin activity, and other characteristics including pretreatment blood pressure levels were incorporated into linear regression models to quantify their contributions to prediction of blood pressure responses. RESULTS Plasma renin activity and pretreatment blood pressure level consistently contributed to prediction of systolic and diastolic responses to each drug administered as mono- and as add-on therapy. Higher plasma renin activity was consistently associated with greater blood pressure responses to atenolol and lesser responses to hydrochlorothiazide. The predictive effects of plasma renin activity were statistically independent of race, age, and other characteristics.
CONCLUSIONS Plasma renin activity and pretreatment blood pressure level predict blood pressure responses to atenolol and hydrochlorothiazide administered as mono- and as add-on therapy in men and women ≤65 years of age. Objective: To test the hypotheses that sodium kinetics are not affected by blood pressure, salt sensitivity, salt resistance or race, and that the kinetics of sodium balance are not a first-order process. Design, participants and interventions: Two studies were conducted. In the first, 18 normotensive and 36 hypertensive men and women were given sodium at 120 mmol/day for 6 days, followed by 10 mmol/day for 8 days, then 400 mmol/day for 8 more days. Salt sensitivity was defined as an increase in diastolic blood pressure from the 10 to the 400 mmol/day intake. Salt resistance was defined as no increase, or a decrease, in diastolic blood pressure with the increased sodium intake. In the second study, 12 white and 12 black normotensive men ingested sodium at 10, 200 or 400 mmol/day in random order, each for 7 days. All urine was collected in both protocols. Setting: Metabolic ward at the University of Greifswald (Greifswald, Germany; study 1), and Clinical Research Center (Indiana University, Indianapolis, Indiana, USA; study 2). Main outcome measure: In addition to conventional statistics, a pharmacokinetic analysis was carried out to determine the elimination rate constant and half-life. Results: In the Greifswald study, when the sodium intake was decreased, a longer half-life was determined for the salt-sensitive than the salt-resistant hypertensive subjects. The half-life for the normotensive salt-sensitive and salt-resistant subjects did not differ. When the sodium intake was decreased, a monoexponential equation fitted the data for all subjects; when the sodium intake was increased, only data for half the subjects could be fitted to the same equation.
In the Indianapolis study, black race had a significant influence upon urinary sodium excretion. Furthermore, the half-life for sodium elimination was dependent upon sodium intake; namely, the greater the intake, the longer the elimination half-life. Conclusions: The time required to reach sodium balance may increase following salt-sensitive increases in blood pressure rather than precede them. Race influences the time required to achieve salt balance. Sodium kinetics are not a first-order process. Presentation, response to therapy, and clinical outcome in hypertension differ according to race, and these observations could relate to differences in microvascular function. We examined forearm microvascular function in age-matched black (n = 56) and white subjects (n = 62) using intra-arterial agonist infusion and venous occlusion plethysmography. In normotensive subjects (n = 70; 34 black and 36 white normotensives), methacholine-, sodium nitroprusside-, and verapamil-induced vasodilation was equivalent in black and white subjects. In hypertensive subjects (n = 48; 22 black and 26 white hypertensives), the vasodilator response to methacholine was markedly lower in black subjects compared with white subjects (P < 0.001). The vasodilator responses to sodium nitroprusside and verapamil, however, were equivalent in black and white hypertensive subjects. Acute ascorbic acid infusion improved the methacholine response equally in black and white hypertensive patients, suggesting that a difference in a rapidly reversible form of oxidative stress does not explain these findings. Thus, the present study demonstrates important racial differences in vascular function and a marked impairment in endothelial vasomotor function in black patients with hypertension.
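The monoexponential (first-order) elimination model behind these half-life estimates can be sketched as follows; the fitting routine and the example rate constant are illustrative assumptions, not values from the study.

```python
import math

def fit_first_order(times, amounts):
    """Least-squares fit of ln(amount) = ln(A0) - k*t, i.e. a
    monoexponential (first-order) elimination model."""
    logs = [math.log(a) for a in amounts]
    n = len(times)
    t_mean = sum(times) / n
    y_mean = sum(logs) / n
    slope = (sum((t - t_mean) * (y - y_mean) for t, y in zip(times, logs))
             / sum((t - t_mean) ** 2 for t in times))
    k = -slope                    # elimination rate constant (1/day)
    return k, math.log(2) / k    # half-life = ln(2)/k

# Hypothetical excess-sodium amounts decaying with k = 0.3/day (t in days)
times = [0, 1, 2, 3, 4, 5]
amounts = [120 * math.exp(-0.3 * t) for t in times]
k, t_half = fit_first_order(times, amounts)
```

Because ln(amount) is linear in time under first-order kinetics, an ordinary least-squares fit of the log-transformed data recovers the rate constant, and the half-life follows as ln(2)/k.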
Further studies will be required to elucidate the mechanisms and determine whether these insights will lead to more appropriately tailored management of hypertension and its complications. STUDY OBJECTIVES To evaluate whether variability in S-metoprolol kinetics and lymphocyte beta 2-receptor-mediated cyclic adenosine monophosphate (cAMP) accumulation is related to the variability in antihypertensive response to metoprolol of black men. DESIGN Prospective, unblinded study. SETTING University-based preventive medicine clinic. PATIENTS Twelve hypertensive black men. MEASUREMENTS AND MAIN RESULTS Ambulatory blood pressure was measured over 24 hours before and after metoprolol administration. Ex vivo responsiveness of lymphocyte beta 2-receptors to isoproterenol was established for each subject before initiating metoprolol therapy. Plasma samples were collected over 12 hours at the conclusion of the study, from which metoprolol enantiomer concentrations were determined by chiral high-performance liquid chromatography, and kinetic values were calculated. The 24-hour ambulatory blood pressure responses to metoprolol were highly variable, with systolic blood pressure responses ranging from -13 to +33 mm Hg and diastolic blood pressure responses ranging from -15 to +15 mm Hg. There was a significant relationship between the metoprolol-induced change in systolic blood pressure and the maximum lymphocyte beta 2-receptor cAMP production (y = 0.47x - 7.79; r2 = 0.49, p < 0.05), such that those with the highest maximum cAMP production had the greatest blood pressure increases during metoprolol therapy. There was no relationship between S-metoprolol concentration and blood pressure response. Mean oral clearance values for S- and R-metoprolol were 1320 and 2346 ml/minute, respectively.
CONCLUSIONS Lymphocyte beta 2-receptor data suggest that individuals most responsive to beta-receptor stimulation may be at greatest risk of blood pressure elevation during beta 2-receptor blockade. The metoprolol enantiomer kinetic data are markedly different from previously published data and may represent racial differences in pharmacokinetics. Background: Failure of hypertension treatment is a major clinical issue because of the high prevalence and the associated mortality risk. We have reported evidence that creatine kinase increases blood pressure through greater sodium retention and cardiovascular contractility, by rapidly providing ATP for these functions. Therefore, we hypothesized that high creatine kinase is associated with failure of antihypertensive treatment. Method: We analyzed a cross-sectional, random multiethnic sample of the general population (N = 1444), aged 34-60 years. The primary outcome was the independent association between resting serum creatine kinase and treated uncontrolled hypertension in the population, using multinomial logistic regression analysis. Results: Hypertension prevalence was, respectively, 26.8%, 30.8%, and 41.2% for the lowest (<88 IU/l) through the highest population creatine kinase tertile (>145 IU/l; P < 0.001). Treatment failed in 72.9% of participants within the highest creatine kinase tertile vs. 46.7% within the lowest tertile (P = 0.004). In logistic regression analysis, creatine kinase was the main predictor of treatment failure (adjusted odds ratio 3.7; 95% confidence interval 1.2-10.9), independent of age, sex, BMI, fasting glucose, ethnicity, or education level. Conclusion: Creatine kinase is associated with failure of antihypertensive therapy.
Further investigations concerning this association might help improve treatment strategies for difficult-to-treat hypertension. A recent genome-wide analysis discovered an association between a haplotype (from rs317689/rs315135/rs7297610) on chromosome 12q15 and blood pressure response to hydrochlorothiazide (HCTZ) in African-Americans. Our aim was to replicate this association and investigate possible functional mechanisms. We observed similar associations between this haplotype and HCTZ response in an independent sample of 746 Caucasians and African-Americans randomized to HCTZ or atenolol treatment. The haplotype association was driven by variation at rs7297610, where C/C genotypes were associated with greater mean (systolic: 3.4 mmHg, P = 0.0275; diastolic: 2.5 mmHg, P = 0.0196) responses to HCTZ vs. T-allele carriers. Such an association was absent in atenolol-treated participants, supporting this as HCTZ-specific. Expression analyses in HCTZ-treated African-Americans showed differential pre-treatment leukocyte YEATS4 expression between rs7297610 genotype groups (P = 0.024), and reduced post-treatment expression in C/C genotypes (P = 0.009), but not in T-carriers. Our data confirm previous genome-wide findings at 12q15 and suggest differential YEATS4 expression could underpin rs7297610-associated HCTZ response variability, which may have future implications for guiding thiazide treatment. Background — Previous studies have reported that blood pressure response to antihypertensive medications is influenced by genetic variation in the renin-angiotensin-aldosterone system, but no clinical trials have tested whether the ACE insertion/deletion (I/D) polymorphism modifies the association between the type of medication and multiple cardiovascular and renal phenotypes.
Methods and Results — We used a double-blind, active-controlled, randomized trial of antihypertensive treatment that included hypertensives ≥55 years of age with ≥1 risk factor for cardiovascular disease. ACE I/D genotypes were determined in 37,939 participants randomized to chlorthalidone, amlodipine, lisinopril, or doxazosin treatments and followed up for 4 to 8 years. Primary outcomes included fatal coronary heart disease (CHD) and/or nonfatal myocardial infarction. Secondary outcomes included stroke, all-cause mortality, combined CHD, and combined cardiovascular disease. Fatal and nonfatal CHD occurred in 3096 individuals during follow-up. The hazard rates for fatal and nonfatal CHD and the secondary outcomes were similar across antihypertensive treatments. ACE I/D genotype group was not associated with fatal and nonfatal CHD (relative risk of DD versus ID and II, 0.99; 95% CI, 0.91 to 1.07) or any secondary outcome. The 6-year hazard rate for fatal and nonfatal CHD in the DD genotype group was not statistically different from the ID and II genotype group by type of treatment. No secondary outcome measure was statistically different across antihypertensive treatment and ACE I/D genotype strata. Conclusions — ACE I/D genotype group was not a predictor of CHD, nor did it modify the response to antihypertensive treatment. We conclude that the ACE I/D polymorphism is not a useful marker to predict antihypertensive treatment response. A randomized, controlled, single-blind trial was conducted to compare the effectiveness of a high-dose diuretic with a combination of a diuretic and metoprolol in black adults with hypertension. All subjects were first treated with 50 mg/d of hydrochlorothiazide for four weeks. Only subjects with a diastolic blood pressure of 95 mm Hg or higher at the end of this four-week period entered the randomized trial.
We hypothesized that black patients with uncontrolled hypertension and low plasma renin activity on usual-dose hydrochlorothiazide therapy (ie, 50 mg/d) would respond better to higher doses of hydrochlorothiazide (ie, 100 to 150 mg/d) than to a usual-dose diuretic and metoprolol. Diuretic-metoprolol combination therapy was significantly more effective than high-dose diuretic therapy regardless of plasma renin status. Summary The beta-adrenoceptor blocking effects of pindolol were compared with those of a placebo in a double-blind trial in twelve hypertensive Africans. Heart rate and arterial blood pressure were measured at rest and immediately after exercise, before and at intervals up to 8 h after oral administration of the drugs. Plasma levels of pindolol were also determined. Pindolol reduced systolic blood pressure and antagonised exercise-induced tachycardia. The mean time to peak level of pindolol was 1.9 h and the mean half-life was 4.2 h. Comparison of plasma levels of pindolol and beta-adrenoceptor blocking activity showed good correlation between them. It is concluded that the pharmacokinetics and beta-blocking effects of pindolol in Africans are not dissimilar from published data for other races. Previous studies have suggested that racial differences may exist in beta-adrenoceptor-mediated responsiveness. However, no clear conclusions can be drawn based on these studies because of the confounding effect of the parasympathetic nervous system on responses to isoproterenol bolus doses. In this study, we blocked the effects of the parasympathetic nervous system with atropine, to determine whether racial differences exist in sensitivity to beta-adrenoceptor stimulation and blockade. Sixteen healthy black and white men participated in the study. Atropine was administered before all studies to induce parasympathetic blockade.
Isoproterenol sensitivity studies and treadmill exercise were then performed with and without beta-adrenoceptor blockade by propranolol. Responses measured included heart rate (HR), blood pressure (BP), and tremor. The average isoproterenol dose producing a 25-beat/min increase in HR was more than twofold higher in blacks than in whites (3.4 ± 1.2 vs. 1.6 ± 0.4 micrograms, respectively, p < 0.05). There were no racial differences in response to beta-adrenoceptor blockade. Our results showed that during parasympathetic blockade blacks were less sensitive to the chronotropic effects of isoproterenol than whites. We conclude that these response differences are due to greater beta-adrenoceptor sensitivity in whites than in blacks. The steady-state bioavailability and pharmacokinetics of propranolol over two consecutive dosing intervals were investigated in 18 black and 10 white normal volunteers following the administration of 20 mg of a test and reference oral dosage form, respectively, every 6 h. There were no differences (p > 0.05) between dosage forms in the mean (n = 28) area under the plasma concentration-time curve (AUC), maximum plasma concentration (Cmax) or time to Cmax (tmax) for propranolol or its active metabolite, 4-hydroxypropranolol. However, as a group, blacks had lower plasma concentrations of propranolol and 4-hydroxypropranolol than whites. The mean AUC and Cmax for propranolol during the second dosing interval (AUC-2 and Cmax-2, respectively) were significantly (p < 0.05) lower in blacks, but there were no ethnic differences (p > 0.05) in tmax. The mean AUC and Cmax for the 4-hydroxylated metabolite during both dosing intervals were significantly (p < 0.05) lower in blacks. Mean oral clearances of propranolol, assuming complete absorption (range: 42.1-54.5 ml min−1 kg−1), were similar (p > 0.05) in each racial group.
There were no substantial changes in heart rate or blood pressure in blacks or whites following propranolol administration. These data suggest that for oral propranolol, blacks have different absorption and disposition characteristics than whites. The angiotensin-converting enzyme (ACE) inhibitor trandolapril, a non-sulfhydryl prodrug which is hydrolysed into trandolaprilat, was studied in 322 hypertensives of African-American descent using a double-blind, randomised, placebo-controlled, parallel study design. Following 6 weeks of double-blind treatment with placebo or 0.25 to 16 mg/day trandolapril, an analysis of drug effect on trough blood pressure (BP) stratified by age, gender, weight, pre-treatment plasma renin activity, and trandolaprilat concentration was performed. Two mg was the lowest effective trandolapril dose, whereas doses above 4 mg did not significantly reduce trough BP. Reduction in BP did not correlate with trough plasma trandolaprilat concentration. Pre-treatment plasma renin activity was not a reliable indicator of antihypertensive response, as similar reductions in BP occurred even in patients with the lowest renin levels. There were no observable differences based on age, gender or measurements of the renin-angiotensin-aldosterone axis. In conclusion, neither age, gender nor plasma renin activity influenced antihypertensive response to angiotensin-converting enzyme inhibition in African-Americans. Healthy young black men and white men received single intravenous doses of metoprolol (0.07 mg/kg) or participated in an isoproterenol sensitivity study before and after metoprolol (0.07 mg/kg followed by 50 µg/min) in a randomized, crossed-over fashion. Noncompartmental pharmacokinetic parameters were calculated.
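Noncompartmental parameters of the kind reported in these studies (AUC, Cmax, tmax, apparent oral clearance CL/F = Dose/AUC) are computed directly from the concentration-time profile. A minimal sketch; the dose and concentrations below are hypothetical illustrations, not data from any of the trials:

```python
def noncompartmental(times, concs, dose_mg):
    """Basic noncompartmental PK parameters from a concentration-time
    profile: AUC by the linear trapezoidal rule, Cmax, tmax, and
    apparent oral clearance CL/F = Dose / AUC."""
    auc = sum((t2 - t1) * (c1 + c2) / 2
              for t1, t2, c1, c2 in zip(times, times[1:], concs, concs[1:]))
    cmax = max(concs)
    tmax = times[concs.index(cmax)]
    cl_f = dose_mg / auc  # mg / (mg/L * h) = L/h
    return auc, cmax, tmax, cl_f

# Hypothetical 20 mg oral dose; concentrations in mg/L, times in hours
times = [0, 0.5, 1, 2, 4, 6]
concs = [0.0, 0.08, 0.10, 0.06, 0.03, 0.01]
auc, cmax, tmax, cl_f = noncompartmental(times, concs, 20)
```

The trapezoidal AUC is the standard first step; richer analyses extrapolate AUC to infinity from the terminal slope, which is omitted here for brevity.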
Isoproterenol dose versus change-in-heart-rate response curves were constructed, and comparisons were made of dose ratio, ED50, Emax, and Ka (the apparent association constant for metoprolol binding to β1-receptors). There were no pharmacokinetic differences observed between the groups. The predicted Emax for the black group was 52.7 ± 8.7 beats/min at a metoprolol concentration of 29.8 ± 6.1 ng/ml, which was higher (p < 0.05) than that in the white group, i.e., 43.7 ± 7.3 beats/min at a concentration of 27.6 ± 9.1 ng/ml. There were no differences in dose ratio, ED50, or Ka. The racial differences in β1-receptor responses to exogenous isoproterenol following metoprolol can simply be explained by an increase in β1-receptor activity in the black subjects, assuming homogeneity in cardiac β2-receptor responses. The antihypertensive effect of the angiotensin-converting enzyme inhibitor trandolapril administered in doses of 1, 2, and 4 mg/d was compared in 207 white patients and 91 black patients with mild to moderate hypertension following a double-blind, randomized, placebo-controlled, parallel study design. Trandolapril is a prodrug that is rapidly hydrolyzed to its active diacid metabolite, trandolaprilat. After 6 weeks of double-blind treatment, trandolapril lowered baseline sitting diastolic pressure in both white and black patients. A comparison of the antihypertensive response of the two populations revealed that the black patients required between two and four times the dose of trandolapril to obtain a response similar to that observed in the white patients. A dose of 1 mg/d trandolapril resulted in a 6.1 mm Hg mean decrease in baseline sitting diastolic pressure for white patients; a similar response (-6.5 mm Hg) was observed in the black patients at 4 mg/d.
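Emax and ED50 describe a hyperbolic dose-response model, E = Emax * D / (ED50 + D). The sketch below, with illustrative parameter values (not those reported above), shows how the model yields the dose producing a given heart-rate increase:

```python
def emax_response(dose, e_max, ed50):
    """Hyperbolic Emax model: effect = Emax * dose / (ED50 + dose)."""
    return e_max * dose / (ed50 + dose)

def dose_for_effect(target, e_max, ed50):
    """Invert the Emax model: dose producing a given sub-maximal effect."""
    return target * ed50 / (e_max - target)

# Illustrative parameters: Emax = 50 beats/min, ED50 = 1.0 µg isoproterenol
dose_25 = dose_for_effect(25, 50.0, 1.0)  # dose giving a 25-beat/min rise
```

A competitive antagonist such as a beta-blocker shifts this curve rightward (larger apparent ED50) without lowering Emax, which is why dose ratios and ED50 are the quantities compared in such sensitivity studies.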
In contrast to the population differences in blood pressure, the decreases in angiotensin-converting enzyme activity were similar for both populations. An evaluation of trandolaprilat levels revealed that there were no racial differences in the trandolaprilat concentrations required to achieve a given degree of angiotensin-converting enzyme inhibition. Therefore, it appears that the antihypertensive response of black patients is not completely explained by a reduction in angiotensin-converting enzyme activity. The lack of response at a lower dose but increasing response at a higher dose could reflect another vasodepressor activity of trandolapril, or just be evidence of reduced sensitivity of high blood pressure in blacks to angiotensin-converting enzyme inhibition. Dietary salt restriction is a recommended adjunct with antihypertensive therapy. There may be racial differences in blood pressure response to salt restriction while on antihypertensive therapy. We performed a multicenter, randomized, double-blind, placebo-controlled, parallel-group clinical trial (black, n = 96; Hispanic, n = 63; white, n = 232). Participants were initially preselected for stage I to III hypertension and then further selected for salt sensitivity (≥5 mm Hg increase in diastolic blood pressure after 3 weeks of low salt [≤88 mmol/d Na+] and high salt [>190 mmol/d Na+] diet). We compared the antihypertensive effect of an angiotensin-converting enzyme inhibitor (enalapril 5 or 20 mg BID) or a calcium channel antagonist (isradipine 5 or 10 mg BID) during alternating periods of high and low salt intake. The main outcome measure was blood pressure change and absolute blood pressure level achieved with therapy.
During the high salt diet (314.7 ± 107.5 mmol/d urinary Na+), there was greater downward change in blood pressure with both enalapril and isradipine compared with the low salt diet (90.1 ± 50.8 mmol/d Na+); however, the absolute blood pressure achieved in all races was consistently lower on a low salt diet for both agents. Black, white, and Hispanic isradipine-treated salt-sensitive hypertensives demonstrated a smaller difference between high and low salt diets (black, -3.6/-1.6 mm Hg; white, -6.2/-3.9 mm Hg; Hispanic, -8.1/-5.3 mm Hg) than did enalapril-treated patients (black, -9.0/-5.3 mm Hg; white, -11.8/-7.0 mm Hg; Hispanic, -11.1/-5.6 mm Hg). On the low salt diet, blacks, whites, and Hispanics had similar blood pressure control with enalapril and isradipine. On the high salt diet, blacks had better blood pressure control with isradipine than with enalapril, whereas there was no difference in the blood pressure control in whites and Hispanics treated with either drug. Dietary salt reduction helps reduce blood pressure in salt-sensitive hypertensive blacks, whites, and Hispanics treated with enalapril or isradipine. These data demonstrate that controlling for salt sensitivity diminishes race-related differences in antihypertensive activity. CONTEXT Renin profiling and age-race subgroup may help select single-drug therapy for stage 1 and stage 2 hypertension. OBJECTIVE To compare the plasma renin profiling and age-race subgroup methods as predictors of response to single-drug therapy in men with stage 1 and 2 hypertension as defined by the Joint National Committee on Prevention, Detection, Evaluation and Treatment of High Blood Pressure. DESIGN The Veterans Affairs Cooperative Study on Single-Drug Therapy of Hypertension, a randomized controlled trial. SETTING Fifteen Veterans Affairs hypertension centers.
PATIENTS A total of 1105 ambulatory men with entry diastolic blood pressure (DBP) of 95 to 109 mm Hg, of whom 1031 had valid plasma and urine samples for renin profiling. INTERVENTIONS Randomization to 1 of 6 antihypertensive drugs: hydrochlorothiazide, atenolol, captopril, clonidine, diltiazem (sustained release), or prazosin. MAIN OUTCOME MEASURE Treatment response as assessed by percentage achieving goal DBP (<90 mm Hg) in response to a single drug that corresponded to patients' renin profile vs a single drug that corresponded to patients' age-race subgroup. RESULTS Clonidine and diltiazem had consistent response rates regardless of renin profile (76%, 67%, and 80% for low, medium, and high renin, respectively, for clonidine, and 83%, 82%, and 83%, respectively, for diltiazem, for patients with baseline DBP of 95-99 mm Hg). Hydrochlorothiazide and prazosin were best in low- and medium-renin profiles; captopril was best in medium- and high-renin profiles (low-, medium-, and high-renin response rates were 82%, 78%, and 14%, respectively, for hydrochlorothiazide; 88%, 67%, and 40%, respectively, for prazosin; and 51%, 83%, and 100%, respectively, for captopril, for patients with baseline DBP of 95-99 mm Hg). Response rates for patients with baseline DBP of 95 to 99 mm Hg by age-race subgroup ranged from 70% for clonidine to 90% for prazosin for younger black men, from 50% for captopril to 97% for diltiazem for older black men, from 70% for hydrochlorothiazide to 92% for atenolol for younger white men, and from 84% for hydrochlorothiazide to 95% for diltiazem for older white men. Patients with a correct treatment for their renin profile but incorrect for age-race subgroup had a response rate of 58.7%; patients with an incorrect treatment for their renin profile but correct for age-race subgroup had a response rate of 63.1% (P = .30).
After controlling for DBP and interactions with treatment group , age-race subgroup ( P<.001 ) significantly predicted response to single-drug therapy , whereas renin profile was of borderline significance ( P= .05 ) . CONCLUSIONS In these men with stage 1 and stage 2 hypertension , therapeutic responses were consistent with baseline renin profile , but age-race subgroup was a better predictor of response Previous studies in white and mixed-race hypertensive patient populations have generally found patients with low renin activity more responsive to diuretic therapy than patients with normal renin activity . Twenty-nine black patients ( 26 women and three men ) with placebo diastolic blood pressure of 90 to 115 mm Hg were treated with spironolactone ( 100 to 400 mg/day ) and hydrochlorothiazide ( 100 mg/day ) . Renin status was categorized by ( 1 ) the intravenous furosemide test , ( 2 ) ambulation during placebo , and ( 3 ) ambulation during spironolactone and hydrochlorothiazide treatment . Only seven patients were categorized identically with all methods . No method identified a low renin subgroup that was more responsive to either spironolactone or hydrochlorothiazide . Diastolic blood pressure fall with hydrochlorothiazide ( 18 mm Hg ) and 400 mg/day of spironolactone ( 15 mm Hg ) was similar . 
Thus , since black women with both low and normal renin activity are quite responsive to diuretics , renin classification to guide initial antihypertensive selection is not warranted In a randomized double-blind study , we compared the short-term effects of nifedipine ( 10 mg 3x daily for 1 day ) versus placebo on 24-h blood pressure , diuresis , natriuresis , urinary excretion of dopamine and metabolites , and on plasma renin activity ( PRA ) and plasma aldosterone levels in 18 black hypertensive ( HT ) patients [ eight salt-resistant ( HT-SR ) and 10 salt-sensitive ( HT-SS ) ] , and in 20 black normotensive ( NT ) subjects ( 12 NT-SR and eight NT-SS ) who were studied randomly with both a high- ( HS ) and a low-salt ( LS ) diet . In comparison to placebo , nifedipine significantly decreased 24-h mean BP in all groups either with HS or LS diets ( all p<0.05 ) . With HS , greater hypotensive effects were achieved in NT-SS ( -10+/-2 mm Hg ) versus NT-SR ( -3+/-1 mm Hg ; p<0.05 ) and in HT-SS ( -18+/-2 mm Hg ) versus HT-SR ( -12+/-2 mm Hg ; p<0.05 ) . In NT-SS and HT-SS , nifedipine induced greater ( p<0.05 ) BP decrease with HS ( -10+/-2 and -18+/-2 mm Hg ) than with LS ( -4+/-1 and -9+/-1 mm Hg , respectively ) , whereas in NT-SR and HT-SR , the hypotensive effect did not differ between HS and LS . Nifedipine versus placebo significantly increased natriuresis and fractional excretion of sodium in all groups only with HS ( p<0.05 ) but not with LS diets . Only in HT-SS were the hypotensive and natriuretic effects of nifedipine significantly correlated ( r = -0.77 ; p<0.01 ) . Nifedipine produced a similar increase of the urinary excretion of dopamine , L-DOPA , and of DOPAC in all subjects , which did not correlate with hypotensive and natriuretic effects . Nifedipine did not modify plasma levels of renin and of aldosterone except in NT-SS with HS , in whom nifedipine increased PRA levels ( p < 0.05 ) . 
We conclude that although nifedipine reduces BP in all groups of NT and HT with LS and HS diets , the effect is greater in salt-sensitive subjects with HS . Although in HT-SS with HS , the short-term natriuretic response to nifedipine may contribute to its hypotensive effects , the diuretic-natriuretic effect of nifedipine is not necessary for the expression of its hypotensive effect . Moreover , it is unlikely that any short-term effects of nifedipine either on the renal dopaminergic system or on the secretion of aldosterone explain nifedipine 's short-term hypotensive and diuretic-natriuretic effects In this study , the relation between renin activity and therapeutic response to hydrochlorothiazide or propranolol was studied . Patients with a diastolic blood pressure of 95 to 114 mm Hg were treated with propranolol ( 40 to 320 mg twice daily ) or hydrochlorothiazide ( 25 to 100 mg twice daily ) . The initial renin profiles were : low , 56 percent ( n = 300 ) ; normal , 33 percent ( n = 174 ) ; high , 11 percent ( n = 60 ) . A greater incidence of low and fewer high renin profiles ( p less than 0.001 ) were observed in blacks . After furosemide administration ( 40 mg intravenously ) , 55 percent of patients ( n = 291 ) had a low renin response and 45 percent ( n = 240 ) had a normal renin response . No correlation between renin profile and renin response was observed , although low renin response and low renin profile occurred more frequently in older patients . Hydrochlorothiazide administration resulted in a greater decrement in diastolic blood pressure ( p less than 0.05 ) in the total group . Irrespective of renin activity , both hydrochlorothiazide and propranolol reduced diastolic blood pressure . When renin profile was considered , no significant variation in response to hydrochlorothiazide therapy was observed , and there was a greater reduction in diastolic blood pressure in the patients with a high renin profile receiving propranolol . 
In comparing therapeutic response , patients with a low renin profile had a better response to hydrochlorothiazide , and propranolol was more effective in patients with a high renin profile . The anticipated effect of therapy on plasma renin activity was observed . Although these results are consistent with a volume-vasoconstrictor analysis of hypertension , the results of therapy could not have been prejudged from renin profile or responsivity . The slight differences observed do not warrant the expense of renin determinations when a simple determination of therapeutic response is sufficient Since salt intake may affect blood pressure response to antihypertensive drugs , an individual 's salt-sensitivity status may be an important consideration in the selection of a medication . The purpose of this single-blind study was to assess the impact of salt sensitivity on the antihypertensive effects of isradipine . A total of 21 evaluable hypertensive patients ( 10 white , 11 black ) 35 to 73 years of age ( mean 55.9 years ) were randomized to a low-salt diet ( mean 24-hour urine sodium 100+/-14 mmol ) or a high-salt diet ( mean 24-hour urine sodium 210+/-22 mmol ) for 7 weeks , followed by crossover to the other diet after a 2-week washout period . On each diet regimen , patients received placebo for 2 weeks , followed by optimal titration of isradipine ( 2.5 to 10 mg BID ) for blood pressure control during the last 5 weeks . On the high-salt diet , salt-sensitive hypertensives ( mean arterial blood pressure increase > or = 5 mm Hg , n=5 ) exhibited a systolic/diastolic blood pressure change of -18.7/-19.6 mm Hg from 157.2/102.9 mm Hg after 5 weeks of isradipine treatment , whereas on a low-salt diet , blood pressure change was -6.9/-12.0 mm Hg from 148.7/97.3 mm Hg . 
Non-salt-sensitive patients ( n=16 ) exhibited a systolic/diastolic blood pressure change of -12.6/-7.6 mm Hg from 155.3/98.6 mm Hg on the high-salt diet and -19.2/-10.9 mm Hg from 161.0/102.6 mm Hg on the low-salt diet after treatment with isradipine . The absolute blood pressure attained in both salt-sensitive and non-salt-sensitive patients was almost identical with isradipine therapy despite variation in dietary salt , although slightly higher doses of isradipine were required in the salt-sensitive group . Consequently , isradipine , and perhaps calcium antagonists in general , manifests a more robust blood pressure-lowering effect in the setting of high sodium intake . This effect does , however , appear to be largely confined to individuals who are salt sensitive Objective It has yet to be determined whether genotyping at the angiotensin-converting enzyme ( ACE ) locus is predictive of blood pressure response to an ACE inhibitor . Methods Participants from the African American Study of Kidney Disease and Hypertension trial randomized to the ACE inhibitor ramipril ( n = 347 ) were genotyped at three polymorphisms on ACE , just downstream from the ACE insertion/deletion polymorphism ( Ins/Del ) : G12269A , C17888T , and G20037A . Time to reach target mean arterial pressure ( ≤107 mmHg ) was analyzed by genotype and ACE haplotype using Kaplan–Meier survival curves and Cox proportional hazard models . Results Individuals with a homozygous genotype at G12269A responded significantly faster than those with a heterozygous genotype ; the adjusted ( average number of medications and baseline mean arterial pressure ) hazard ratio ( homozygous compared to heterozygous genotype ) was 1.86 ( 95 % confidence limits 1.32–3.23 ; P < 0.001 for G12269A genotype ) . The adjusted hazard ratio for participants with homozygous ACE haplotypes compared to those with heterozygous ACE haplotypes was 1.40 ( 1.13–1.75 ; P = 0.003 for haplotype ) . 
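The time-to-target analyses in the ACE-genotype abstract above use Kaplan–Meier survival curves . For illustration only , a minimal pure-Python sketch of the product-limit estimator S(t) = prod over event times of ( 1 - d_i/n_i ) ; the input data below are made-up toy values , not the AASK trial data :

```python
from itertools import groupby

def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) estimate of the survival function.

    times: observed follow-up times; events: 1 = event observed, 0 = censored.
    Returns a list of (time, S(t)) pairs at each distinct event time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    for t, grp in groupby(data, key=lambda pair: pair[0]):
        grp = list(grp)
        d = sum(e for _, e in grp)  # events at time t
        if d:
            surv *= 1 - d / n_at_risk
            curve.append((t, surv))
        n_at_risk -= len(grp)  # events and censored subjects leave the risk set
    return curve

# Toy example: 5 subjects, events at t=1, 2, 3; censoring at t=2 and t=4
print(kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0]))
```

A hazard ratio from a Cox model, as reported in the abstract, compares instantaneous event rates between groups rather than these raw survival curves, but the curves are the usual companion display.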
The ACE genotype effects were specific for ACE inhibition ( i.e. , not seen among those randomized to a calcium channel blocker ) , and were independent of population stratification . Conclusions African-Americans with a homozygous genotype at G12269A or homozygous ACE haplotypes responded to ramipril significantly faster than those with a heterozygous genotype or heterozygous haplotypes , suggesting that heterosis may be an important determinant of responsiveness to an ACE inhibitor . These associations may be a result of biological activity of this polymorphism , or of linkage disequilibrium with nearby variants such as the ACE Ins/Del , perhaps in the regulation of ACE splicing In this randomised double-blind parallel study , we compared the efficacy of labetalol and atenolol in a group of black ( n = 33 ) and white ( n = 34 ) hypertensives with uncomplicated essential hypertension after obtaining pretreatment renin profiles . After single-blind placebo ( 14 - 21 days ) , patients with standing diastolic BP between 105 - 119 mmHg were randomised to receive either labetalol ( 100 - 800 mg twice daily ) or atenolol ( 50 - 100 mg once daily ) to achieve a DBP less than 90 mmHg . Dosage titration occurred at weekly intervals for labetalol and biweekly for atenolol . The supine BP decrease with atenolol was -18/-14 vs. -6/-6 mmHg in whites vs. blacks respectively . With labetalol , it was -13/-12 in whites and -2/-7 mmHg in blacks . Standing BPs were : -19/-14 vs. -4/-5 , whites vs. blacks with atenolol and -17/-17 vs. -19/-9 mmHg with labetalol . Neither labetalol nor atenolol was as effective in black compared with white hypertensives . The atenolol but not labetalol BP response was positively correlated with pretreatment renin values Furosemide and hydrochlorothiazide were compared for treatment of black patients with mild to moderate hypertension in a randomized , open-label , crossover study design . 
Hydrochlorothiazide produced a significantly greater fall in mean arterial ( 24.7 vs 16.0 mm Hg , P less than .01 ) and diastolic ( 17.3 vs 10.1 mm Hg , P less than .01 ) blood pressure ( BP ) in 16 patients . Addition of methyldopa in nine patients produced a significantly greater fall in mean arterial ( 38.8 vs 31.9 mm Hg , P less than .05 ) and diastolic ( 28.9 vs 23.4 mm Hg , P less than .05 ) BP with hydrochlorothiazide vs furosemide . Renin status was categorized before and after treatment . Patients with low and normal renin activity were equally responsive to both diuretics . Hydrochlorothiazide caused a greater reduction in plasma potassium ( 0.26 mEq/L ) . Serum parathyroid hormone was not chronically elevated with furosemide . In this study , hydrochlorothiazide was more effective than furosemide for treatment of mild to moderate hypertension in black patients ; renin classification did not predict diuretic responsiveness Although blacks have lower plasma renin activity compared to whites , the corresponding differences in serum angiotensin converting enzyme ( ACE ) levels have not been well studied . Furthermore , few studies have examined the relationship of renin activity and ACE levels to blood pressure ( BP ) in blacks . We addressed these questions in a cross-sectional study conducted in 110 blacks and 183 whites who were not on antihypertensive medications . Three BP readings were obtained during a clinic visit . Plasma renin activity was assayed by radioimmunoassay and serum ACE levels were measured by spectrophotometry . Mean systolic and diastolic BP were 122.6 and 77.9 mm Hg in the blacks , and 123.4 and 77.9 mm Hg in the whites , respectively . Plasma renin activity was significantly lower in the blacks compared to the whites ( 0.92 v 1.26 ng/mL/h , respectively , P < .05 ) , but ACE levels were similar in both groups ( 28.8 v 29.6 U/L , respectively ) . 
Renin activity was significantly and inversely associated with systolic and diastolic BP in both the blacks and the whites . ACE levels , however , were inversely associated with BP in the blacks but positively associated with BP in the whites ( P = .02 for interaction on diastolic BP ) , even after adjustment for age , gender , body mass index ( BMI ) , alcohol consumption , and heart rate . The corresponding interaction between ACE level and race on systolic BP was of borderline significance ( P = .06 ) . These results suggest that levels of ACE are similar in blacks and whites but their association with BP is possibly reflecting underlying ethnic differences in regulation of BP The possible racial differences in alpha 1-adrenoceptor responsiveness and the blood pressure and heart rate responses following alpha 1-adrenoceptor antagonism with trimazosin have been investigated in matched groups of six Caucasians and six Nigerians . There were no significant differences between the racial groups in the blood pressure and heart rate responses to oral ( 200 mg ) and intravenous ( 100 mg ) trimazosin . alpha 1-adrenoceptor responsiveness was similar in both groups after placebo and following both active treatments . There were only minor pharmacokinetic differences with the Caucasians having a larger volume of distribution , and a longer terminal elimination half-life for the metabolite , 1-hydroxy-trimazosin . These results suggest a similarity in peripheral vascular alpha 1-adrenoceptor mechanisms and show no major significant racial differences in the pharmacokinetics and pharmacodynamics of trimazosin BACKGROUND This study focuses on the relationship between β(1)-adrenergic receptor ( ADRB1 ) polymorphisms and blood pressure response to the β-blocker metoprolol among African Americans with early hypertensive nephrosclerosis . 
METHODS Participants from the African-American Study of Kidney Disease and Hypertension ( AASK ) trial were genotyped for ADRB1 polymorphisms : Ser49Gly and Arg389Gly . Cox proportional hazards models were used to determine the relationship between ADRB1 polymorphisms and time to reach a mean arterial pressure ( MAP ) of ≤107 mm Hg in the first year after randomization , adjusted for other predictors of blood pressure response . RESULTS In the Ser49Gly model , Ser49/Gly49 individuals were less responsive compared to Ser49/Ser49 only among the more obese ( body mass index ( BMI ) ≥39 kg/m(2) ) participants ( P < 0.05 for genotype × BMI interaction ) . The hazard ratio ( HR ) with a BMI of 39 kg/m(2) was 0.68 ( 95 % confidence interval ( CI ) 0.46 - 0.99 ) . In the Arg389Gly model , participants with Arg389 were less likely to respond to metoprolol : HR : 0.68 ( 95 % CI 0.50 - 0.93 ) . In addition , women were less responsive to metoprolol compared to men : HR : 0.78 ( 95 % CI 0.60 - 0.995 ) . CONCLUSIONS Ser49/Gly49 was predictive of blood pressure response to metoprolol only among more obese African Americans with early hypertensive nephrosclerosis . In contrast to other studies suggesting increased short-term responsiveness to β-blockers with Arg389 , Arg389 individuals were less responsive in this study analyzing blood pressure over a 1-year period . This may be partly explained by decreased agonist-promoted desensitization with Arg389 . However , gender , physiological adaptation to stress , interactions between genes and between genes and the environment , as well as study in other patient populations need to be considered Objective : We have shown previously that the combination of captopril and nifedipine was effective at peak response but was very short acting . 
We therefore decided to study the longer-acting angiotensin converting enzyme inhibitor lisinopril and the long-acting calcium antagonist amlodipine , each alone and in combination , in a double-blind , randomized crossover study in which blood pressures were measured at peak and trough . This study provided the opportunity to investigate what parameters in these patients might possibly predict the fall in blood pressure with the individual drugs and with the combination Methods : Fifteen patients with essential hypertension ( eight male , 10 Caucasian ; mean age 53 years ) were studied . After 1 month observation on no treatment they were entered into a single-blind run-in of placebo given once a day for 1 month . Patients were then allocated randomly to amlodipine ( 5 mg once a day ) , lisinopril ( 10 mg once a day ) or their combination ( once a day ) for 1 month in a double-blind crossover study . All patients were studied on their usual diet and no dietary advice was given . Blood pressure was measured by semi-automatic ultrasound sphygmomanometer both 24 h and 6 h ( trough and peak ) after the last dose Results : During the crossover part of the study there was a significant additional blood pressure-lowering effect ( at trough ) of the combination compared with either amlodipine or lisinopril alone . Similar results were observed for the blood pressures at peak . The fall in blood pressure with lisinopril was related to baseline plasma renin activity , whereas when amlodipine was given , either alone or in combination , the fall in blood pressure was independent of baseline renin activity . The Blacks ( n=5 ) appeared not to respond as well to lisinopril as the Caucasians ( n=10 ) . 
Finally , the blood pressure response to amlodipine tended to be associated with the severity of hypertension Conclusions : The results of the present study indicate that : amlodipine and lisinopril in combination have a marked additional effect on blood pressure compared with either given as a monotherapy ; their potentiation of action is long-acting ; Black patients tend not to respond to the monotherapy with lisinopril as well as Caucasian patients , although they respond similarly to the combination ; the response to amlodipine tends to be greater the higher the initial blood pressure ; and , finally , the response to lisinopril is greater the higher the plasma renin BACKGROUND Vasodilator reactivity is attenuated in normotensive blacks , and this may contribute to their enhanced susceptibility to hypertension and its complications . However , the mechanisms responsible for this phenomenon are unknown . We therefore studied nitric oxide (NO)-dependent and -independent vasorelaxation in healthy blacks and whites to investigate the nature of racial differences in vasodilator function . METHODS AND RESULTS Forearm flow responses to intra-arterial infusion of increasing doses of acetylcholine ( a vasodilator that stimulates endothelial release of NO ) , sodium nitroprusside ( an exogenous NO donor ) , and isoproterenol ( a beta-adrenergic agonist whose vasodilator effect stems from the combination of direct smooth muscle stimulation and endothelial NO release ) were studied in 18 normotensive whites and 18 blacks by use of strain-gauge plethysmography . A blunted vasodilator response to acetylcholine ( 7.2+/-1.1 versus 14.4+/-1.8 mL.min-1.dL-1 ; P<0.001 ) and sodium nitroprusside ( 8.2+/-1.1 versus 12.1+/-1.3 mL.min-1.dL-1 ; P<0.001 ) was observed in blacks compared with whites , suggesting decreased cGMP-mediated smooth muscle relaxation . 
The vasodilator effect of isoproterenol was lower in blacks than in whites both before ( 10.9+/-1.7 versus 14.9+/-1.5 mL.min-1.dL-1 ; P=0.006 ) and after NG-monomethyl-L-arginine ( 6.1+/-1.2 versus 10.1+/-0.8 mL.min-1.dL-1 ; P<0.001 ) , implying that cAMP-dependent vasodilator response to isoproterenol is diminished in blacks . No significant difference was observed in the hyperemic response to forearm ischemia . CONCLUSIONS Compared with whites , healthy blacks have reduced vasodilation in response to NO-dependent and -independent stimuli . This difference seems to be related to an attenuation in cyclic nucleotide-mediated vascular smooth muscle relaxation and may play a role in the increased prevalence of hypertension and its complications in blacks The T allele of the C825T polymorphism of the gene encoding the β3-subunit of G proteins has been associated with increased sodium-hydrogen exchange and low renin in patients with essential hypertension . To assess its association with blood pressure response to diuretic therapy , we measured the C825T polymorphism in 197 blacks ( 134 men , 63 women ) and 190 non-Hispanic whites ( 76 men , 114 women ) with essential hypertension ( mean±SD age 48±7 years ) , who underwent monotherapy with hydrochlorothiazide for 4 weeks . Mean declines in systolic and diastolic blood pressures were 6±2 ( P < 0.001 ) and 5±1 ( P < 0.001 ) mm Hg greater , respectively , in TT than in CC homozygotes . Responses in heterozygotes were intermediate between the homozygous groups . Other univariate predictors of greater blood pressure responses included black race , female gender , higher pretreatment blood pressure , older age , lower waist-to-hip ratio , and measures of lower renin-angiotensin-aldosterone system activity . After the effects of the other predictors were considered , the TT genotype remained a significant predictor of greater declines in systolic and diastolic blood pressures . 
Thus , the C825T polymorphism of the G protein β3-subunit may help identify patients with essential hypertension who are more responsive to diuretic therapy Nifedipine was administered to 12 healthy Nigerian volunteers as a single oral dose of 20 mg capsule under fasting conditions . The pharmacokinetic results were compared with published data using the same protocol and analytical method for 27 Caucasians and 30 South Asians . The area under the plasma concentration-time curve ( AUC ) of nifedipine in Nigerians ( 808 +/- 250 ng ml-1 h ) was significantly higher ( P < 0.001 ) than that in Caucasians ( 323 +/- 116 ng ml-1 h ) and the difference remained significant ( P < 0.001 ) when corrected for body weight . The elimination half-life was also significantly higher ( P < 0.01 ) in Nigerians ( 5.03 +/- 1.96 h ) than in Caucasians ( 2.78 +/- 1.11 h ) . No significant differences were observed between Nigerians and South Asians in either AUC or half-life of nifedipine . The AUC of the nitropyridine metabolite was higher ( P < 0.01 ) in Nigerians ( 220 +/- 51 ng ml-1 h ) compared with that in Caucasians ( 154 +/- 56 ng ml-1 h ) but the difference was not maintained when corrected for body weight . The AUC corrected for body weight and the elimination half-life of the metabolite were significantly higher in South Asians compared with those of Nigerians and Caucasians . The pharmacokinetics of oral nifedipine in Nigerians were similar to those in South Asians and therefore may also arise from a lower systemic clearance compared with Caucasians as has been reported previously for South Asians
11,033
23,454,259
However , no significant differences were observed in the survival to hospital discharge , favorable neurological outcome at hospital discharge , and rearrest . This review demonstrates that prehospital therapeutic hypothermia after cardiac arrest can decrease temperature on hospital admission . On the other hand , regarding the survival to hospital discharge , favorable neurological outcome at hospital discharge , and rearrest , our meta-analysis and review produce non-significant results .
BACKGROUND Therapeutic hypothermia has been recommended for the treatment of cardiac arrest patients who remain comatose after the return of spontaneous circulation . However , the optimal time to initiate therapeutic hypothermia remains unclear . The objective of the present study is to assess the effectiveness and safety of prehospital therapeutic hypothermia after cardiac arrest .
Background —Recent clinical studies have demonstrated that hypothermia to 32 ° to 34 ° C provides significant clinical benefit when induced after resuscitation from cardiac arrest . However , cooling during the postresuscitation period was slow , requiring 4 to 8 hours to achieve target temperatures after return of spontaneous circulation ( ROSC ) . Whether more rapid cooling would further improve survival remains unclear . We sought to determine whether cooling during cardiac arrest before ROSC ( ie , “ intra-arrest ” hypothermia ) has survival benefit over more delayed post-ROSC cooling , using a murine cardiac arrest model . Methods and Results —A model of potassium-induced cardiac arrest was established in C57BL/6 mice . After 8 minutes of untreated cardiac arrest , resuscitation was attempted with chest compression , ventilation , and intravenous fluid . Mice were randomized to 3 treatment groups ( n=10 each ) : an intra-arrest hypothermia group , in which mice were cooled to 30 ° C just before attempted resuscitation , and then rewarmed after 1 hour ; a post-ROSC hypothermia group , in which mice were kept at 37 ° C for 20 minutes after successful ROSC and then were cooled to 30 ° C for 1 hour ; and a normothermic control group , in which mice were kept at 37 ° C . The intra-arrest hypothermia group demonstrated better 72-hour survival than delayed hypothermia and normothermia groups ( 6/10 versus 1/10 and 1/10 survivors , respectively , P < 0.05 ) , with similar differences seen at 6-hour survival and on neurological scoring . Conclusions —Timing of hypothermia is a crucial determinant of survival in the murine arrest model . Early intra-arrest cooling appears to be significantly better than delayed post-ROSC cooling or normothermic resuscitation Background — Therapeutic hypothermia is recommended for the treatment of neurological injury after resuscitation from out-of-hospital cardiac arrest . 
Laboratory studies have suggested that earlier cooling may be associated with improved neurological outcomes . We hypothesized that induction of therapeutic hypothermia by paramedics before hospital arrival would improve outcome . Methods and Results — In a prospective , randomized controlled trial , we assigned adults who had been resuscitated from out-of-hospital cardiac arrest with an initial cardiac rhythm of ventricular fibrillation to either prehospital cooling with a rapid infusion of 2 L of ice-cold lactated Ringer 's solution or cooling after hospital admission . The primary outcome measure was functional status at hospital discharge , with a favorable outcome defined as discharge either to home or to a rehabilitation facility . A total of 234 patients were randomly assigned to either paramedic cooling ( 118 patients ) or hospital cooling ( 116 patients ) . Patients allocated to paramedic cooling received a median of 1900 mL ( first quartile 1000 mL , third quartile 2000 mL ) of ice-cold fluid . This resulted in a mean decrease in core temperature of 0.8 ° C ( P=0.01 ) . In the paramedic-cooled group , 47.5 % of patients had a favorable outcome at hospital discharge compared with 52.6 % in the hospital-cooled group ( risk ratio 0.90 , 95 % confidence interval 0.70 to 1.17 , P=0.43 ) . Conclusions — In adults who have been resuscitated from out-of-hospital cardiac arrest with an initial cardiac rhythm of ventricular fibrillation , paramedic cooling with a rapid infusion of large-volume , ice-cold intravenous fluid decreased core temperature at hospital arrival but was not shown to improve outcome at hospital discharge compared with cooling commenced in the hospital . Clinical Trial Registration — URL : http://www.anzctr.org.au . 
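The headline result of the paramedic-cooling trial above ( risk ratio 0.90 , 95 % CI 0.70 to 1.17 ) can be reproduced with the standard Katz log-based interval for a ratio of two proportions . A sketch follows ; note that the event counts 56/118 and 61/116 are back-calculated from the reported 47.5 % and 52.6 % and group sizes , so they are assumptions rather than figures taken directly from the paper :

```python
import math

def risk_ratio(a, n1, c, n2, z=1.96):
    """Risk ratio (a/n1) / (c/n2) with a Katz log-based confidence interval."""
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)  # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Favorable outcome: ~56/118 paramedic-cooled vs ~61/116 hospital-cooled (assumed counts)
rr, lo, hi = risk_ratio(56, 118, 61, 116)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

Rounded to two decimals this yields 0.90 ( 0.70 to 1.17 ) , matching the abstract , which supports the back-calculated counts .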
Unique identifier : ACTRN12605000179639 Background — Although delayed hospital cooling has been demonstrated to improve outcome after cardiac arrest , in-field cooling started immediately after the return of spontaneous circulation may be more beneficial . The aims of the present pilot study were to assess the feasibility , safety , and effectiveness of in-field cooling . Methods and Results — We determined the effect on esophageal temperature , before hospital arrival , of infusing up to 2 L of 4 ° C normal saline as soon as possible after resuscitation from out-of-hospital cardiac arrest . A total of 125 such patients were randomized to receive standard care with or without intravenous cooling . Of the 63 patients randomized to cooling , 49 ( 78 % ) received an infusion of 500 to 2000 mL of 4 ° C normal saline before hospital arrival . These 63 patients experienced a mean temperature decrease of 1.24±1 ° C with a hospital arrival temperature of 34.7 ° C , whereas the 62 patients not randomized to cooling experienced a mean temperature increase of 0.10±0.94 ° C ( P<0.0001 ) with a hospital arrival temperature of 35.7 ° C . In-field cooling was not associated with adverse consequences in terms of blood pressure , heart rate , arterial oxygenation , evidence for pulmonary edema on initial chest x-ray , or rearrest . Secondary end points of awakening and discharged alive from hospital trended toward improvement in ventricular fibrillation patients randomized to in-field cooling . Conclusions — These pilot data suggest that infusion of up to 2 L of 4 ° C normal saline in the field is feasible , safe , and effective in lowering temperature . 
We propose that the effect of this cooling method on neurological outcome after cardiac arrest be studied in larger numbers of patients , especially those whose initial rhythm is ventricular fibrillation Background The International Liaison Committee on Resuscitation ( ILCOR ) now recommends therapeutic hypothermia ( TH ) ( 33 ° C for 12 - 24 hours ) as soon as possible for patients who remain comatose after resuscitation from shockable rhythm in out-of-hospital cardiac arrest and that it be considered for non-shockable rhythms . The optimal timing of TH is still uncertain . Laboratory data have suggested that there is significantly decreased neurological injury if cooling is initiated during CPR . In addition , peri-arrest cooling may increase the rate of successful defibrillation . This study aims to determine whether paramedic cooling during CPR improves outcome compared with standard treatment in patients who are being resuscitated from out-of-hospital cardiac arrest . Methods / Design This paper describes the methodology for a definitive multi-centre , randomised , controlled trial of paramedic cooling during CPR compared with standard treatment . Paramedic cooling during CPR will be achieved using a rapid infusion of large volume ( 20 - 40 mL/kg to a maximum of 2 litres ) ice-cold ( 4 ° C ) normal saline . The primary outcome measure is survival at hospital discharge . Secondary outcome measures are rates of return of spontaneous circulation , rate of survival to hospital admission , temperature on arrival at hospital , and 12 month quality of life of survivors . Discussion This trial will test the effect of the administration of ice cold saline during CPR on survival outcomes . If this simple treatment is found to improve outcomes , it will have generalisability to prehospital services globally . Trial Registration ClinicalTrials.gov : Objective : Accurate measurement of temperature is vital in the intensive care setting . 
A prospective trial was performed to compare the accuracy of tympanic , urinary , and axillary temperatures with that of pulmonary artery ( PA ) core temperature measurements . Design : A total of 110 patients were enrolled in a prospective observational cohort study . Setting : Multidisciplinary intensive care unit of a university teaching hospital . Patients : The cohort was ( mean ± sd ) 65 ± 16 yrs of age , Acute Physiology and Chronic Health Evaluation ( APACHE ) II score was 25 ± 9 , 58 % of the patients were men , and 76 % were mechanically ventilated . The accuracy of tympanic ( averaged over both ears ) , axillary ( averaged over both sides ) , and urinary temperatures was referenced ( as mean difference , Δ , degrees centigrade ) to PA temperatures as standard in 6,703 recordings . Lin concordance correlation ( pc ) and Bland–Altman 95 % limits of agreement ( degrees centigrade ) described the relationship between paired measurements . Regression analysis ( linear mixed model ) assessed covariate confounding with respect to temperature modes and reliability formulated as an intraclass correlation coefficient . Measurements and Main Results : Concordance of PA temperatures with tympanic , urinary , and axillary was 0.77 , 0.92 , and 0.83 , respectively . Compared with PA temperatures , Δ ( limits of agreement ) were 0.36 ° C ( −0.56 ° C , 1.28 ° C ) , −0.05 ° C ( −0.69 ° C , 0.59 ° C ) , and 0.30 ° C ( −0.42 ° C , 1.01 ° C ) for tympanic , urinary , and axillary temperatures , respectively . Temperature measurement mode effect , estimated via regression analysis , was consistent with concordance and Δ ( PA vs. urinary , p = .98 ) . Patient age ( p = .03 ) , sedation score ( p = .0001 ) , and dialysis ( p = .0001 ) had modest negative relations with temperature ; quadratic relationships were identified with adrenaline and dobutamine . 
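The Bland–Altman statistics in the temperature-monitoring abstract above are the bias ( mean paired difference , Δ ) and the 95 % limits of agreement , bias ± 1.96 × SD of the paired differences . A minimal sketch of that computation ; the temperature pairs below are made-up illustrative values , not study data :

```python
import math

def bland_altman(pairs, z=1.96):
    """Bias (mean difference) and 95% limits of agreement for paired readings.

    pairs: (method_a, method_b) measurement pairs.
    Returns (bias, lower_limit, upper_limit) = bias -/+ z * sample SD of diffs.
    """
    diffs = [a - b for a, b in pairs]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))  # sample SD
    return bias, bias - z * sd, bias + z * sd

# Hypothetical paired readings (method A vs method B) in degrees C
pairs = [(36.5, 36.2), (37.0, 36.9), (38.1, 37.8), (36.8, 36.7)]
bias, lower, upper = bland_altman(pairs)
```

If roughly 95 % of paired differences fall between the two limits and the limits are clinically acceptable , the two methods are considered interchangeable , which is the criterion the abstract applies to tympanic , urinary , and axillary versus PA temperatures .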
No interactions with particular temperature modes were identified (p ≥ .12 for all comparisons) and no relationship was identified with either mean arterial pressure or APACHE II score (p ≥ .64). The average temperature mode intraclass correlation coefficient for test–retest reliability was 0.72. Conclusion: Agreement of tympanic with pulmonary temperature was inferior to that of urinary temperature, which, on overall assessment, seemed more likely to reflect PA core temperature Background — Transnasal evaporative cooling has sufficient heat transfer capacity for effective intra-arrest cooling and improves survival in swine. The aim of this study was to determine the safety, feasibility, and cooling efficacy of prehospital transnasal cooling in humans and to explore its effects on neurologically intact survival to hospital discharge. Methods and Results — Witnessed cardiac arrest patients with a treatment interval ≤20 minutes were randomized to intra-arrest cooling with a RhinoChill device (treatment group, n=96) versus standard care (control group, n=104). The final analysis included 93 versus 101 patients, respectively. Both groups were cooled after hospital arrival. The patients had similar demographics, initial rhythms, rates of bystander cardiopulmonary resuscitation, and intervals to cardiopulmonary resuscitation and arrival of advanced life support personnel. Eighteen device-related adverse events (1 periorbital emphysema, 3 epistaxis, 1 perioral bleed, and 13 nasal discolorations) were reported. Time to target temperature of 34°C was shorter in the treatment group for both tympanic (102 versus 282 minutes, P=0.03) and core (155 versus 284 minutes, P=0.13) temperature.
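The temperature-comparison study above reports agreement as a mean difference with Bland–Altman 95% limits of agreement, i.e. mean difference ± 1.96 × SD of the paired differences. A minimal sketch of that calculation follows; the paired readings below are made-up illustrative values (not the study's 6,703 recordings), chosen so the mean difference matches the reported tympanic Δ of 0.36°C.

```python
import statistics

def limits_of_agreement(ref, test):
    """Mean difference and Bland-Altman 95% limits (mean ± 1.96·SD of diffs)."""
    diffs = [t - r for r, t in zip(ref, test)]
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)  # sample SD, as is conventional here
    return mean_d, (mean_d - 1.96 * sd_d, mean_d + 1.96 * sd_d)

# Hypothetical paired readings (°C): pulmonary-artery (reference) vs tympanic
pa = [37.1, 36.8, 38.2, 37.5, 36.9]
tympanic = [37.5, 37.0, 38.7, 37.9, 37.2]
mean_d, (lo, hi) = limits_of_agreement(pa, tympanic)
```

A narrow interval around a small mean difference indicates good agreement; the study's wide tympanic limits (−0.56°C to 1.28°C) are what justify its conclusion that tympanic readings track PA temperature less reliably than urinary ones.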
There were no significant differences in rates of return of spontaneous circulation between the groups (38% in treated subjects versus 43% in control subjects, P=0.48), in overall survival of those admitted alive (44% versus 31%, respectively, P=0.26), or in neurologically intact survival to discharge (Pittsburgh cerebral performance category scale 1 to 2, 34% versus 21%, P=0.21), although the study was not adequately powered to detect changes in these outcomes. Conclusions — Prehospital intra-arrest transnasal cooling is safe and feasible and is associated with a significant improvement in the time intervals required to cool patients. Clinical Trial Registration — URL: http://www.clinicaltrials.gov. Unique identifier: NCT00808236 BACKGROUND Therapeutic hypothermia (TH) represents an important method to attenuate post-resuscitation injury after cardiac arrest. Laboratory investigations have suggested that induction of hypothermia before return of spontaneous circulation (ROSC) may confer the greatest benefit. We hypothesized that a short delay in resuscitation to induce hypothermia before ROSC, even at the expense of more prolonged ischemia, may yield both physiological and survival advantages. METHODS Cardiac arrest was induced in C57BL/6 mice using intravenous potassium chloride; resuscitation was attempted with CPR and fluid administration. Animals were randomized into three groups (n=15 each): a normothermic control group, in which 8 min of arrest at 37 degrees C was followed by resuscitation; an early intra-arrest hypothermia group, in which 6.5 min of 37 degrees C arrest were followed by 90 s of cooling, with resuscitation attempted at 30 degrees C (8 min total ischemia); and a delayed intra-arrest hypothermia group, with 90 s of cooling begun after 8 min of 37 degrees C ischemia, so that animals underwent resuscitation at 9.5 min.
RESULTS Animals treated with TH demonstrated improved hemodynamic variables and survival compared to normothermic controls. This was the case even when comparing the delayed intra-arrest hypothermia group with prolonged ischemia time against normothermic controls with shorter ischemia time (7-day survival, 4/15 vs. 0/15, p<0.001). CONCLUSIONS Short resuscitation delays to allow establishment of hypothermia before ROSC appear beneficial to both cardiac function and survival. This finding supports the concept that post-resuscitation injury processes begin immediately after ROSC, and that intra-arrest cooling may serve as a useful therapeutic approach to improve survival BACKGROUND Cardiac arrest outside the hospital is common and has a poor outcome. Studies in laboratory animals suggest that hypothermia induced shortly after the restoration of spontaneous circulation may improve neurologic outcome, but there have been no conclusive studies in humans. In a randomized, controlled trial, we compared the effects of moderate hypothermia and normothermia in patients who remained unconscious after resuscitation from out-of-hospital cardiac arrest. METHODS The study subjects were 77 patients who were randomly assigned to treatment with hypothermia (with the core body temperature reduced to 33 degrees C within 2 hours after the return of spontaneous circulation and maintained at that temperature for 12 hours) or normothermia. The primary outcome measure was survival to hospital discharge with sufficiently good neurologic function to be discharged to home or to a rehabilitation facility. RESULTS The demographic characteristics of the patients were similar in the hypothermia and normothermia groups. Twenty-one of the 43 patients treated with hypothermia (49 percent) survived and had a good outcome, that is, they were discharged home or to a rehabilitation facility, as compared with 9 of the 34 treated with normothermia (26 percent, P=0.046).
After adjustment for base-line differences in age and time from collapse to the return of spontaneous circulation, the odds ratio for a good outcome with hypothermia as compared with normothermia was 5.25 (95 percent confidence interval, 1.47 to 18.76; P=0.011). Hypothermia was associated with a lower cardiac index, higher systemic vascular resistance, and hyperglycemia. There was no difference in the frequency of adverse events. CONCLUSIONS Our preliminary observations suggest that treatment with moderate hypothermia appears to improve outcomes in patients with coma after resuscitation from out-of-hospital cardiac arrest Background Animal studies suggest that the induction of therapeutic hypothermia in patients after cardiac arrest should be initiated as soon as possible after ROSC to achieve optimal neuroprotective benefit. A "gold standard" for the method of inducing hypothermia quickly and safely has not yet been established. In order to evaluate the feasibility of a hypothermia cap, we conducted a study in the prehospital setting. Methods and results The hypothermia cap was applied to 20 patients after out-of-hospital cardiac arrest, a median of 10 min after ROSC (25/75 IQR 8–15 min). The median time interval between initiation of cooling and hospital admission was 28 min (19–40 min). The median tympanic temperature before application of the hypothermia cap was 35.5°C (34.8–36.3). By hospital admission we observed a drop of tympanic temperature to a median of 34.4°C (33.6–35.4). This difference was statistically significant (P < 0.001). We could not observe any side effects related to the hypothermia cap. 25 patients who had not received prehospital cooling procedures served as a control group. Their temperature at hospital admission was 35.9°C (35.3–36.4), significantly different from that of patients treated with the hypothermia cap (P < 0.001).
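The 5.25 odds ratio quoted in the hypothermia trial above is an adjusted estimate from a regression model. As an illustrative sketch (not the study's analysis code), the crude odds ratio and a Woolf-type 95% confidence interval can be computed directly from the reported 2×2 counts: 21 good / 22 poor outcomes with hypothermia versus 9 good / 25 poor with normothermia.

```python
import math

def odds_ratio(a, b, c, d):
    """Crude OR = (a·d)/(b·c) for a 2x2 table, with a Woolf 95% CI
    computed on the log-odds scale: exp(ln OR ± 1.96·SE)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Rows: hypothermia (21 good, 22 poor), normothermia (9 good, 25 poor)
or_, (lo, hi) = odds_ratio(21, 22, 9, 25)
```

The crude value comes out near 2.65, noticeably smaller than the adjusted 5.25, which shows how much the covariate adjustment (age, time to ROSC) shifted the estimate in this small trial.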
Conclusions In summary, we demonstrated that the prehospital use of hypothermia caps is a safe and effective procedure to start therapeutic hypothermia after cardiac arrest. This approach is rapidly available, inexpensive, non-invasive, easy to learn and applicable in almost any situation Objective: Employing transnasal head-cooling in a pig model of prolonged ventricular fibrillation, we compared the effects of 4 hrs of head-cooling started during cardiopulmonary resuscitation with those of 8 hrs of surface-cooling started 2 hrs after resuscitation on 96-hr survival and neurologic outcomes. Design: Prospective controlled animal study. Setting: University-affiliated research laboratory. Subjects: Domestic pigs. Interventions: Twenty-four male pigs were subjected to 10 min of untreated ventricular fibrillation followed by 5 min of cardiopulmonary resuscitation. In the head-cooling group, hypothermia was started with cardiopulmonary resuscitation and continued for 4 hrs after resuscitation. In the surface-cooling group, systemic hypothermia with a cooling blanket was started, in accord with current clinical practices, at 2 hrs after resuscitation and continued for 8 hrs. Methods in the control animal studies were identical except for temperature interventions. Measurements and Main Results: All animals were resuscitated except for one animal in each of the surface-cooling and control groups. After 5 min of cardiopulmonary resuscitation, jugular vein temperature was significantly decreased in the head-cooled animals. However, there were no differences in pulmonary artery temperatures among the three groups at that time. Nevertheless, both head-cooled and surface-cooled animals had improved 96-hr survival after resuscitation. Significantly better neurologic outcomes were observed in early head-cooled animals in the first 3 days after resuscitation.
Conclusion: Early head-cooling during cardiopulmonary resuscitation, continued for 4 hrs after resuscitation, produced favorable survival and neurologic outcomes in comparison with delayed surface-cooling of 8 hrs duration BACKGROUND Experimental animal studies and previous randomized trials suggest an improvement in mortality and neurologic function with induced hypothermia after cardiac arrest. International guidelines advocate the use of target temperature management of 32°C to 34°C for 12 to 24 hours after resuscitation from out-of-hospital cardiac arrest. A systematic review indicates that the evidence for recommending this intervention is inconclusive, and the GRADE level of evidence is low. Previous trials were small, with high risk of bias, evaluated select populations, and did not treat hyperthermia in the control groups. The optimal target temperature management strategy is not known. METHODS The TTM trial is an investigator-initiated, international, randomized, parallel-group, and assessor-blinded clinical trial designed to enroll at least 850 adult, unconscious patients resuscitated after out-of-hospital cardiac arrest of a presumed cardiac cause. The patients will be randomized to a target temperature management of either 33°C or 36°C after return of spontaneous circulation. In both groups, the intervention will last 36 hours. The primary outcome is all-cause mortality at maximal follow-up. The main secondary outcomes are the composite outcome of all-cause mortality and poor neurologic function (cerebral performance categories 3 and 4) at hospital discharge and at 180 days, cognitive status and quality of life at 180 days, and assessment of safety and harm.
DISCUSSION The TTM trial will investigate potential benefit and harm of 2 target temperature strategies, both avoiding hyperthermia in a large proportion of the out-of-hospital cardiac arrest population Objective: To evaluate the effects on temperature and outcome at hospital discharge of a pre-hospital rapid infusion of large-volume, ice-cold intravenous Hartmann's solution in patients with out-of-hospital cardiac arrest and an initial cardiac rhythm of asystole or pulseless electrical activity. Design: Prospective, randomized, controlled clinical trial. Setting: Pre-hospital emergency medical service and 12 critical care units in Melbourne, Australia. Patients: One hundred and sixty-three patients who had been resuscitated from cardiac arrest with an initial cardiac rhythm of asystole or pulseless electrical activity. Interventions: Patients were randomized to either pre-hospital cooling using a rapid infusion of up to two litres of ice-cold Hartmann's solution (82 patients) or cooling after hospital admission (81 patients). The planned duration of therapeutic hypothermia (32°C–34°C) in both groups was 24 hrs. Measurements and Main Results: Patients allocated to pre-hospital cooling received a median of 1500 ml of ice-cold fluid. This resulted in a mean decrease in core temperature of 1.4°C compared with 0.2°C in hospital-cooled patients (p < .001). The time to therapeutic hypothermia (< 34°C) was 3.2 hrs in the pre-hospital cooled group compared with 4.8 hrs in the hospital-cooled group (p = .0328). Both groups received a mean of 15 hrs of cooling in hospital, and only 7 patients in each group were cooled for 24 hrs. Overall, there was no difference in outcomes at hospital discharge, with favorable outcome (discharge from hospital to home or rehabilitation) in 10 of 82 (12%) of the pre-hospital cooled patients, compared with 7 of 81 (9%) of the hospital-cooled patients (p = .50).
In the patients with a cardiac cause of the arrest, 8 of 47 patients (17%) who received pre-hospital cooling had a favorable outcome at hospital discharge compared with 3 of 43 (7%) in the hospital-cooled group (p = .146). Conclusions: In adults who have been resuscitated from out-of-hospital cardiac arrest with an initial cardiac rhythm of asystole or pulseless electrical activity, pre-hospital cooling using a rapid infusion of large-volume, ice-cold intravenous Hartmann's solution decreases core temperature at hospital arrival and decreases the time to therapeutic hypothermia. In patients with a cardiac cause of the arrest, this treatment may increase the rate of favorable outcome at hospital discharge. Further larger studies should evaluate the effects of pre-hospital cooling when the initial cardiac rhythm is asystole or pulseless electrical activity, particularly in patients with a cardiac cause of the arrest
11,034
14,687,160
Dietary assessments have frequently implicated fatty foods in symptom induction, and these findings are supported by laboratory-based studies, particularly the demonstration that FD patients more often experience symptoms after intraduodenal infusions of fat than glucose.
Ut quod ali cibus est aliis fuat acre venenum ("What is food to one man is bitter poison to others", Lucretius, 99-55 BC). Functional dyspepsia (FD) remains a relatively poorly characterized gastrointestinal disorder of unknown etiology that is frequently difficult to manage. A systematic review of the literature relating to food intake and FD is summarized here. Many patients with FD report symptoms after meal ingestion, including fullness, bloating, epigastric pain, nausea, and vomiting, and this has been interpreted as indicative of an underlying "motor disorder of the stomach or small intestine." Such hypotheses are, however, still largely unsubstantiated, and the data that do exist are inconclusive, particularly as few studies have directly examined the temporal relationships between dyspeptic symptoms, meal ingestion, and disordered gastric motility. Moreover, studies attempting to relate symptoms to specific disturbances in gastric motor function have, in most cases, not evaluated symptoms concurrently with the function test, and/or have used suboptimal symptom scoring to quantify symptoms. Furthermore, the term "early satiety" has been used loosely as a symptom, rather than as a quantitative measure of food intake. Currently, the most widely accepted mechanism underlying FD is visceral hypersensitivity, which may contribute to both enhanced motor and symptomatic responses to food ingestion.
Sumatriptan, a 5-hydroxytryptamine1 (5-HT1) receptor agonist at enteric neuronal 5-HT receptors, causes a relaxation of the gastric fundus and inhibition of antral contractile activity. The present study examined the effect of sumatriptan on gastric emptying of solids and liquids in humans. In eight healthy subjects the gastric emptying rate for liquids and solids was measured using the carbon-labeled glycine and octanoic acid breath test after subcutaneous administration of placebo or sumatriptan. Sumatriptan increased the gastric half-emptying time of liquids (P < 0.0005) and induced a prolonged lag phase for liquids (P < 0.0005) in all subjects. Sumatriptan increased the gastric half-emptying time (P < 0.005) and the lag phase of solids (P < 0.05) in all subjects. In two healthy subjects gastric emptying of liquids and solids after subcutaneous administration of sumatriptan was studied by radioscintigraphy. Radioscintigraphy confirmed the delayed emptying and the prolonged lag phases after sumatriptan. In conclusion, sumatriptan delays gastric emptying of solids and liquids in healthy subjects. Moreover, sumatriptan induces a lag phase for liquids. The mechanism by which sumatriptan alters gastric emptying remains to be studied To verify the influence of food consistency on satiety mechanisms, we evaluated the effects of the same meal in solid-liquid (SM) and homogenized (HM) form on satiety sensation, gastric emptying rate and plasma cholecystokinin (CCK) concentration. Eight healthy men, aged 21-28 (mean 24.5) years, were given two meals (cooked vegetables 250 g, cheese 35 g, croutons 50 g and olive oil 25 g, total energy 2573 kJ, with water 300 ml) differing only in physical state: SM and HM. The subjects consumed the meals in randomized order on non-consecutive days.
The sensations of fullness, satiety and desire to eat were evaluated by means of a questionnaire, gastric emptying was assessed by ultrasonographic measurement of antral area, and plasma CCK concentration was measured by radioimmunoassay. The vegetable-rich meal was significantly more satiating (P < 0.05) when in the HM form than when eaten in the SM state. Furthermore, the overall gastric emptying time was significantly slowed (255 (SEM 11) min after HM v. 214 (SEM 12) min after SM; P < 0.05) and the CCK peak occurred later (94 (SEM 12) min after HM v. 62 (SEM 11) min after SM; NS) when the food was consumed in the HM form. Independently of the type of meal, antral area was significantly related to fullness sensations (r2 0.46, P = 0.004). These results demonstrate that meal consistency is an important physical food characteristic which influences both gastric emptying rate and satiety sensation. Moreover, the relationship observed between antral area and fullness sensation confirms that antral distension plays a part in the regulation of eating behaviour To ascertain the effect of gastric emptying on the symptoms of non-ulcer dyspepsia (NUD) patients, we randomly selected 60 NUD patients and, as controls, 26 dyspepsia-free volunteers. We measured the gastric emptying time of mixed food (270 kcal), using real-time ultrasonography, in two ways. NUD patients were divided randomly into two groups and given domperidone or placebo in a double-blind trial. Of the NUD patients, 48% had delayed gastric emptying times and associated epigastric pain, bloating, early satiety, and regurgitation. A prokinetic agent not only improved emptying time but also relieved some of the symptoms of the NUD patients. Real-time ultrasonography proved a useful method for evaluating gastric emptying Duodenal lipid exacerbates gastrointestinal sensations during gastric distension.
Using luminal application of the local anesthetic benzocaine, we investigated the role of intestinal receptors in the induction of these sensations. Nine healthy subjects were studied on five occasions, during which isotonic saline or 20% lipid (2 kcal/min), combined with (duodenal or jejunal) 0.75% benzocaine or vehicle at 2.5 ml/min, was infused intraduodenally before and during gastric distension. Intragastric pressures and volumes, gastrointestinal sensations, and plasma CCK levels were determined. Duodenal lipid combined with vehicle increased gastric volume (in ml: saline, -10 ± 18; lipid/vehicle, 237 ± 30) and plasma CCK [mean levels (pmol/l): saline, 2.0 ± 0.2; lipid/vehicle, 8.0 ± 1.6] and, during distensions, induced nausea (scores: saline, 3 ± 2; lipid/vehicle, 58 ± 19) and decreased the pressures at which fullness and discomfort occurred. Duodenal but not jejunal benzocaine attenuated the effect of lipid on gastric volume, plasma CCK, and nausea during distension (135 ± 38 and 216 ± 40 ml, 4.6 ± 0.6 pmol/l and not assessed, and 37 ± 12 and 64 ± 21 for lipid + duodenal benzocaine and lipid + jejunal benzocaine, respectively) and on pressures for sensations. In conclusion, intestinal receptors modulate gastrointestinal sensations associated with duodenal lipid and gastric distension. There is also the potential for local neural mechanisms to regulate CCK release and thereby reduce afferent activation indirectly Sumatriptan, a 5HT1 receptor agonist, inhibits antral motor activity, delays gastric emptying and relaxes the gastric fundus. The aim of this study was to characterize the effect of sumatriptan on transpyloric flow and gastric accommodation during and immediately after ingestion of a liquid meal using duplex sonography. Ten healthy subjects were investigated twice on separate days.
In random order, either sumatriptan 6 mg (Imigran 0.5 mL) or a placebo was given s.c. 15 min before ingesting 500 mL of a meat soup. The subjects were examined during the 3-min period before ingestion of the liquid meal, the 3 min spent drinking the meal and 10 min postprandially. Sumatriptan caused a significant widening of both the gastric antrum (P=0.02) and the proximal stomach (P=0.01) 10 min postprandially as compared with placebo. It caused no significant difference in time to initial gastric emptying (P=0.2), but significantly delayed commencement of peristaltic-related transpyloric flow (P=0.04). Sumatriptan had no significant effect on mean abdominal symptom scores, but after sumatriptan there was a significant negative correlation between width of postprandial antral area and postprandial nausea, and between width of postprandial antral area and postprandial bloating. We therefore conclude that sumatriptan causes a postprandial dilatation of both the distal and the proximal stomach with no change in dyspeptic symptoms nor in the length of time to first gastric emptying. Time to commencement of peristaltic-related emptying is delayed The role of cholecystokinin (CCK) in the regulation of gastric emptying and pancreatic enzyme secretion was evaluated by infusing the CCK-receptor antagonist loxiglumide. Gastric emptying rates and pancreatic secretory outputs were measured in five healthy volunteers by the double-indicator perfusion technique using a multiple-lumen tube in the duodenum. Placebo or loxiglumide (22 mumol.kg-1.h-1) was infused throughout each experiment. Five hundred-milliliter liquid intragastric meals of (a) fat, protein, and glucose (Ensure; Abbott, Chicago, IL); (b) glucose, 20 g/dL; and (c) guar gum, 1.1 g/dL, were given in random order. In addition, the effect of a physiologic CCK-8 dose (20 pmol.kg-1.h-1) after an intragastric 500-mL saline meal (0.154 mol/L) was tested.
Intravenous CCK-8 induced a marked retardation of the gastric emptying rate of the saline solution (P less than 0.05) while stimulating pancreatic secretory outputs; both effects were completely abolished by the infusion of loxiglumide. Loxiglumide markedly accelerated the gastric emptying rates (by approximately 40%) and simultaneously diminished lipase (by approximately 75%) and trypsin (by approximately 50%) outputs of both the mixed meal (P less than 0.01) and the pure glucose meal (P less than 0.05). Additional experiments using gamma camera scintigraphy confirmed the accelerating effect of loxiglumide on gastric emptying of the mixed meal (P less than 0.01). The gastric emptying rate of the guar meal, which did not release CCK, was not influenced by the infusion of loxiglumide. Loxiglumide distinctly augmented plasma CCK levels after the mixed (2.6 times) and the pure glucose (2.1 times) meals while markedly reducing (approximately 76%) pancreatic polypeptide release (P less than 0.02). It is concluded that endogenous CCK exerts a major role in the regulation of both gastric liquid emptying and pancreatic secretion in humans The factors influencing appetite in humans are poorly understood. There is a weak relation between appetite and gastric emptying in normal subjects. Recent studies have shown that fasting and postprandial antral areas increase in patients with functional dyspepsia compared with normal subjects. We evaluated the hypothesis that antral area, and hence antral distention, is a significant determinant of postprandial fullness. Fourteen normal subjects had simultaneous measurements of gastric emptying by scintigraphy and antral area by ultrasound after ingestion of 350 mL of 20% glucose. Fullness and hunger were assessed by visual analog scales.
Measurements of the gastric-emptying half time (t1/2) by scintigraphy and ultrasound were not significantly different (129.6 ± 11.8 min compared with 115.6 ± 11.4 min). Fullness increased (P < 0.001) and hunger decreased (P < 0.001) after the drink. Both fullness and the magnitude of the increase in fullness after the drink were related to antral area (r > 0.56, P < 0.05), the increase in antral area (r > 0.59, P < 0.05), and the scintigraphic content of the distal stomach (r > 0.57, P < 0.05), but not to the ultrasound or scintigraphic t1/2 values. In contrast, hunger and the magnitude of the decrease in hunger after the drink were not related to either antral area, the increase in antral area, or the rate of gastric emptying. We conclude that postprandial fullness, but not hunger, was closely related to antral distention in normal subjects BACKGROUND: Peripheral administration of glucagon-like peptide-1 (GLP-1) for four hours, to normal weight and obese humans, decreases food intake and suppresses appetite. OBJECTIVE: The aim of this study was to assess the effect of an eight-hour infusion of GLP-1 on appetite and energy intake at lunch and dinner in obese subjects. DESIGN: Randomised, blinded cross-over design with intravenous infusion of GLP-1 (0.75 pmol·kg−1·min−1) or saline. SUBJECTS: Eight obese (body mass index, BMI, 45.5±2.3 kg/m2) male subjects. MEASUREMENTS: Ad libitum energy intake at lunch (12.00 h) and dinner (16.00 h) after an energy-fixed breakfast (2.4 MJ) at 08.00 h. Appetite sensations using visual analogue scales (VAS) immediately before and after meals and hourly in-between. Blood samples for the analysis of glucose, insulin, C-peptide, GLP-1 and peptide YY. Gastric emptying after breakfast and lunch using a paracetamol absorption technique. RESULTS: Hunger ratings were significantly lower with GLP-1 infusion.
The summed ad libitum energy intake at lunch and dinner was reduced by 1.7±0.5 MJ (21±6%) by GLP-1 infusion (P=0.01). Gastric emptying was delayed by GLP-1 infusion, and plasma glucose concentrations decreased (baseline: 6.6±0.35 mmol/L; nadir: 5.3±0.15 mmol/L). No nausea was recorded during GLP-1 infusion. CONCLUSIONS: Our results demonstrate that GLP-1 decreases feelings of hunger and reduces energy intake in obese humans. One possible mechanism for this finding might be an increased satiety primarily mediated by gastric vagal afferent signals BACKGROUND & AIMS Paying attention to the gut may magnify perception of abdominal symptoms, but the actual influence of attention by anticipatory knowledge and distraction on gut perception remains poorly defined. The aim of this study was to determine whether mental activity, attention vs. distraction, affects intestinal perception and whether mental effects are synergistic with other modulatory mechanisms. METHODS Perception of 1-minute intestinal balloon distentions applied at 7-13-minute random intervals was measured in healthy subjects. First, distentions were tested during attention by anticipatory knowledge and during distraction (n = 8). Because somatic transcutaneous electrical nerve stimulation (TENS) reduces gut perception, distentions were then tested during attention alone, attention plus somatic TENS, and during distraction plus TENS (n = 8). RESULTS Perception of intestinal distentions was higher during attention than during distraction (3.3 ± 0.2 vs. 2.9 ± 0.1 [mean ± SEM]; P < 0.05). The area of somatic projection was greater during attention (P < 0.05). Intestinal compliance and oral reflex relaxation remained unchanged. During application of somatic TENS, perception of intestinal distention was higher during attention than distraction (2.4 ± 0.3 vs. 1.7 ± 0.2; P < 0.05). However, TENS did not alter the perception score during attention.
CONCLUSIONS Mental activity may modulate gut perception and overrides the effects of somatic TENS on gut perception Satiation, the process that brings eating to an end, and satiety, the state of inhibition over further eating, may be influenced by cholecystokinin (CCK). In animal and human studies, it has been shown that infusion of exogenous CCK decreases food intake, but the doses given may well have led to supraphysiological plasma concentrations. This study was done to discover if a low dose of intraduodenal fat, releasing physiological amounts of endogenous cholecystokinin, exerts satiation or satiety effects, or both, and if these effects could be inhibited by the CCK-receptor antagonist loxiglumide. In 10 healthy lean volunteers (5 F, 5 M, mean age 26), three tests were performed in a randomised blind fashion. Intralipid 20% (6 g/h) (experiments A and C) or saline (experiment B) was given intraduodenally from 1030 until 1300. The subjects received saline (experiments A and B) or loxiglumide (experiment C), a specific CCK-receptor antagonist (10 mg/kg/h), intravenously from 0930 until 1300. At 1200 a meal was served. At regular time intervals hunger feelings were measured using visual analogue scales and food selection lists, and plasma CCK was measured by radioimmunoassay. Food intake (mean (SEM)) during intraduodenal fat (206 (35) g) was lower than in the control study (269 (37) g, p = 0.09). Loxiglumide largely prevented the inhibitory effect of intraduodenal fat on food intake (245 (30) g). From 1030 until the meal at 1200 there was a significant satiating effect of intraduodenal fat compared with the control and loxiglumide experiments according to the food selection lists, which was because of the satiating effect for the fat-rich items (p<0.05). Also, feelings of fullness were significantly higher during intraduodenal fat than in the control or loxiglumide experiments (p<0.05).
During intraduodenal fat there was a significant increase of plasma CCK from 2.4 (0.3) to 4.8 (0.4) pM (p<0.001). Loxiglumide led to an exaggerated CCK release, to a peak concentration of 16 (2.4) pM before the meal. This study shows that in humans low-dose intraduodenal fat increases satiety and satiation, mainly through the effect of CCK Objective Different subgroups can be identified in functional dyspepsia based on symptom type or severity, and may correlate with pathophysiological disturbances. In particular, female sex and severe fullness and vomiting have been reported to be strong independent predictors of slow solid gastric emptying. We aimed to determine if symptom patterns or severity could identify those with abnormal gastric emptying among patients with dysmotility-like functional dyspepsia and, for comparison, type I diabetes mellitus. METHODS: Patients with postprandial symptoms and documented functional dyspepsia by endoscopy (n = 551) and patients with type I diabetes who had postprandial dyspepsia (n = 247), enrolling in two separate randomized controlled trials, were evaluated at baseline. Patients were assigned to either the delayed or the normal gastric emptying stratum, based on a validated C13 octanoic acid breath test with sampling over 4 h. A self-report questionnaire measured the presence and severity of eight symptoms on visual analog scales. The validated Nepean Dyspepsia Index measured the frequency, severity, and bothersomeness of 15 upper GI symptoms on Likert scales. RESULTS: Gastric emptying was definitely delayed (t1/2 > 192 min) in 24% of patients with functional dyspepsia and 28% with diabetes. Delayed gastric emptying was associated with female gender but not age or Helicobacter pylori status. The age- and sex-adjusted risk (odds ratio) of delayed gastric emptying for the upper GI symptoms ranged from 0.99 to 1.0 (all p values ≥0.2).
The results were very similar in functional dyspepsia and diabetes. There was also no correlation between t1/2 and the number of symptoms or symptom severity scores. CONCLUSION: Symptom prevalence and severity were similar in dyspeptic patients with and without delayed gastric emptying. Specific symptoms do not seem to be of predictive value in dysmotility-like dyspepsia for identifying alterations of gastric emptying. It has been proposed that patients with dyspepsia can be classified into symptom groupings that may represent different pathophysiological entities; however, it remains to be shown that distinct symptom subgroups exist. To estimate the prevalence of dyspepsia (defined as upper abdominal pain) and dyspepsia subgroups, an age- and sex-stratified random sample of Olmsted County, Minnesota, residents aged 30-64 years were mailed a valid self-report questionnaire; 82% responded (n = 835). Subgroups were as follows: those with symptoms suggestive of peptic ulceration (ulcer-like dyspepsia), those with gastric stasis (dysmotility-like dyspepsia), those with gastroesophageal reflux (reflux-like dyspepsia), and the remainder (unspecified dyspepsia). Ulcer-like dyspepsia was the commonest subgroup (prevalence 16.0/100; 95% confidence interval 13.4-18.5), but 43% of subjects with dyspepsia could be classified into more than one subgroup. Nearly one third of dyspeptics also had irritable bowel symptoms, but these were not confined to any particular dyspepsia subgroup.
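The prevalence interval quoted above (16.0 per 100, 95% CI 13.4-18.5, from n = 835 responders) can be reproduced to within rounding by the standard normal-approximation (Wald) interval for a proportion. A minimal sketch, assuming that is roughly the method used:

```python
import math

def prevalence_ci(p_hat: float, n: int, z: float = 1.96):
    """Wald 95% confidence interval for a prevalence proportion.

    p_hat: observed proportion (e.g. 0.16 for 16.0 per 100)
    n: sample size
    Returns (lower, upper) expressed per 100 persons.
    """
    se = math.sqrt(p_hat * (1.0 - p_hat) / n)
    return 100 * (p_hat - z * se), 100 * (p_hat + z * se)

# Ulcer-like dyspepsia: 16.0 per 100 among 835 responders
lo, hi = prevalence_ci(0.16, 835)
# lo ≈ 13.5, hi ≈ 18.5 -- close to the published 13.4-18.5; the small
# discrepancy likely reflects rounding or an exact binomial method.
```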
Although dyspepsia is very common in the community and the majority have ulcer-like symptoms, there is such overlap among the dyspepsia subgroups that a classification based on symptoms alone in uninvestigated patients may not be useful. Twenty-eight patients with chronic idiopathic dyspepsia, defined by the presence of chronic unexplained symptoms suggestive of gastric stasis and directly related to food ingestion, were included in this prospective study. Gastric emptying of the liquid and solid phases of a meal was quantified by a dual-isotope method, and symptoms were evaluated by a diary and a visual analog scale. Delayed gastric emptying was evidenced in 59% of the dyspeptic patients; it occurred with liquids in more cases than solids. Quantitative and qualitative evaluation of symptoms was of no practical value in predicting the presence of objective stasis. The dyspeptic patients were included in a double-blind randomized controlled trial of cisapride, a new gastrokinetic drug devoid of central antiemetic effects. After six weeks of cisapride treatment, all patients with initially abnormal gastric emptying rates for liquids, and all but one for solids, returned to normal ranges, and significant differences between cisapride and placebo groups were observed for half-emptying times of both solids (136±16 min vs 227±32 min; P<0.02) and liquids (61±4 min vs 132±37 min; P<0.01). Cisapride also significantly improved dyspeptic symptom scores at weeks 3 and 6 of treatment as compared with those measured before treatment. Nevertheless, the decrease in global diary score was significantly greater than that seen with placebo at week 3 (−16±6 vs −1±9; P<0.05), but not at week 6 (−18±5 vs −10±8).
The symptomatic effect of cisapride at week 3 was significantly more pronounced in patients with abnormal initial gastric emptying than in those with normal gastric emptying (−30±7 vs −4±6; P<0.02). These results underline the importance of objective evaluation of gastric emptying in the detection of patients with gastric stasis, who exhibited the best symptomatic response to cisapride. In a double-blind crossover comparison with placebo, the effects of cisapride (10 mg tid for two weeks), a non-antidopaminergic gastrointestinal prokinetic drug, on gastric emptying times and on symptoms were evaluated in 12 patients with chronic idiopathic dyspepsia and gastroparesis. Gastric emptying was studied by a radioisotopic gamma camera technique. The test meal was labelled in the solid component (99mTc-sulphur colloid infiltrated chicken liver). Nine symptoms (nausea, belching, regurgitations, vomiting, postprandial drowsiness, early satiety, epigastric pain or burning, heartburn) were graded weekly on a questionnaire. Cisapride was significantly more effective than placebo in shortening the t1/2 of gastric emptying (p2 = 0.04), but no significant difference was observed between the two treatments with regard to the improvement of total symptom score (p2 = 0.09). No side effects were reported during the study. Functional gastrointestinal disorders, including the irritable bowel syndrome, account for up to 40% of referrals to gastroenterologists, but accurate data on the natural history of these disorders in the general population are lacking. Using a reliable and valid questionnaire, the authors estimated the onset and disappearance of symptoms consistent with functional gastrointestinal disorders. An age- and sex-stratified random sample of 1,021 eligible residents of Olmsted County, Minnesota, aged 30-64 years were initially mailed the questionnaire; 82% responded (n = 835).
In a remailing to responders 12-20 months later, 83% responded again (n = 690). The age- and sex-adjusted prevalence rates per 100 for irritable bowel syndrome, chronic constipation, chronic diarrhea, and frequent dyspepsia were 18.1 (95% confidence interval (CI) 15.1-21.1), 14.7 (95% CI 11.9-17.4), 7.3 (95% CI 5.3-9.3), and 14.1 (95% CI 11.5-16.8), respectively, on the second mailing. Symptoms were not significantly associated with nonresponse to the second mailing; moreover, the estimated prevalence rates were not significantly different from the first mailing. Among the 582 subjects free of the irritable bowel syndrome on the first survey, 9% developed symptoms during 795 person-years of follow-up, while 38% of the 108 who initially had the irritable bowel syndrome did not meet the criteria after 146 person-years of follow-up. Similar onset and disappearance rates were observed for the other main symptom categories. While functional gastrointestinal symptoms are common in middle-aged persons and overall prevalence appears relatively stable over 12-20 months, substantial turnover is implied by the observed onset and disappearance rates; several potential sources of bias do not seem to account for this variation. The effects of domperidone, a peripherally acting dopamine antagonist, were compared with those of placebo in a double-blind randomized study in 16 patients with idiopathic gastric stasis, chronic symptoms of "nonulcer dyspepsia" (including nausea, vomiting, and abdominal pain), and altered gastroduodenal motility. Patients received either domperidone or placebo orally (20 mg before meals and at bedtime) for six weeks. Symptoms were assessed by daily diaries kept by the patients for two weeks while receiving no medication for their gastrointestinal complaints (baseline), and throughout the six-week treatment phase.
Studies of gastric emptying of a radiolabeled solid-phase meal were performed at baseline and six weeks after treatment. All patients had delayed gastric emptying at baseline, defined as a half-emptying time of more than mean + 1 SD (from studies of normal controls). An 18- to 24-hr recording of gastroduodenal motor function during fasting was also performed at baseline and after six weeks of either domperidone or placebo treatment. After six weeks of treatment, the symptom scores significantly improved in the domperidone group (P<0.05), but not in the placebo group. Gastroduodenal motor activity was unchanged from baseline recordings after six weeks. Solid-phase gastric emptying also showed no improvement in either the domperidone or placebo group of patients. Although domperidone therapy had no significant effect on motility, it appears to be an effective drug for the treatment of the symptoms of nonulcer dyspepsia. Background: Abnormal visceral mechanosensory and vagal function may play a role in the development of functional gastrointestinal disorders. Aims: To assess whether vagal efferent and afferent function is linked with small intestinal mechanosensory function. Methods: In seven patients with functional dyspepsia, six patients with a history of Billroth I gastrectomy and/or vagotomy, and seven healthy controls, intestinal perception thresholds were tested by a randomised ramp distension procedure performed with a barostat device. On a separate day, an insulin hypoglycaemia test was performed to assess the plasma levels of pancreatic polypeptide (PP) in response to hypoglycaemia, as a test of efferent vagal function. Results: First perception of intestinal balloon distension occurred at significantly lower pressures in patients with functional dyspepsia (median 19.3, range 14.7-25.3 mm Hg) compared with healthy controls (median 26.0, range 21.7-43.7 mm Hg, p<0.01).
Sensory thresholds were significantly lower in patients after gastrectomy (median 12.2, range 8.0-14.7 mm Hg, p<0.05 versus all others). In healthy controls and patients with functional dyspepsia, insulin hypoglycaemia significantly (p<0.001) increased plasma PP levels. However, only two of seven patients with functional dyspepsia had a more than twofold increase in PP values, whereas all healthy controls had a more than twofold increase in PP levels after insulin hypoglycaemia (p<0.05). In contrast, there was no significant PP response in the gastrectomised patients (median 2%, range −10 to +23%). PP responses and visceral sensory thresholds were significantly correlated (r = 0.65, p<0.002). Conclusions: The diminished PP response after insulin hypoglycaemia indicates disturbed efferent vagal function in a subgroup of patients with functional dyspepsia. The data also suggest that the intact vagal nerve may exert an antinociceptive visceral effect. The purpose of this study was to investigate whether sublingual glyceryl trinitrate influences the size of the proximal stomach and postprandial symptoms in patients with functional dyspepsia. Twenty patients with functional dyspepsia were included in a double-blind, placebo-controlled crossover study with sublingual glyceryl trinitrate. All patients were scanned twice on consecutive days, receiving either placebo or 0.5 mg glyceryl trinitrate randomly 5 min prior to ingestion of 500 ml meat soup. Total symptoms, pain, nausea, and bloating were scored on a visual analog scale before and after the meal. Standardized ultrasonograms of the proximal and distal stomach were obtained 1, 10, and 20 min postprandially.
The proximal stomach was larger in the sagittal section at 1 min postcibally (26.5 ± 3.9 vs 24.8 ± 4.9 cm², P = 0.036) and 10 min postprandially (22.0 ± 5.1 vs 19.8 ± 5.3 cm², P = 0.009) after administration of glyceryl trinitrate compared with placebo, whereas a tendency was observed after 20 min (18.7 ± 5.5 vs 17.3 ± 5.7 cm², P = 0.076). The corresponding changes in the frontal diameters were 8.3 ± 1.1 vs 7.8 ± 1.2 cm (P = 0.067) after 1 min, 7.2 ± 0.9 vs 6.4 ± 0.8 cm (P = 0.001) after 10 min, and 6.3 ± 1.1 vs 5.6 ± 1.2 cm (P = 0.016) after 20 min. The area of the distal stomach was not different (P = 0.31) in the two groups. After administration of glyceryl trinitrate, the patients reported less pain (P = 0.048) and nausea (P = 0.023) 5 min postprandially, but this effect was reduced 15 min later. Total symptom score was improved by glyceryl trinitrate treatment (P = 0.042). Sublingual glyceryl trinitrate improves accommodation of the proximal stomach to a meal and reduces postprandial symptoms in a group of patients with functional dyspepsia. This study was done to determine the relative effects of energy content and weight of ingested food on subsequent satiety and food intake. The weight/volume and the energy content of nine preloads were manipulated, in a 3 × 3 factorial design, to give three weight levels (250, 500 and 750 g) and three energy levels (0, 1.26 and 2.51 MJ; 0, 300 and 600 kcal). The weights were varied by the addition of water, while the energy levels were varied by using yogurt and cream. Each of the 1.26 and 2.51 MJ preloads contained 27 g of protein and 31 g of carbohydrates. The 1.26 MJ preloads contained 8 g of fat and the 2.51 MJ preloads had 41 g of fat. Each of the nine preloads was presented as a lunch to 21 female and 16 male subjects. Two hours after the preloads, subjects consumed sweet and savory snacks and various drinks ad libitum from a buffet.
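The preload energy levels above pair MJ and kcal values (0, 1.26 and 2.51 MJ against 0, 300 and 600 kcal). These correspond under the standard thermochemical conversion of 1 kcal = 4.184 kJ, which a quick check confirms:

```python
KJ_PER_KCAL = 4.184  # standard thermochemical calorie in kilojoules

def kcal_to_mj(kcal: float) -> float:
    """Convert kilocalories to megajoules."""
    return kcal * KJ_PER_KCAL / 1000.0

for kcal in (0, 300, 600):
    print(kcal, "kcal =", round(kcal_to_mj(kcal), 2), "MJ")
# 0 kcal = 0.0 MJ, 300 kcal = 1.26 MJ, 600 kcal = 2.51 MJ
```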
The weight of the preload had a small but statistically significant effect on feelings of hunger and satiety between preload and buffet, and on energy intake during the buffet (5.34, 5.05 and 5.04 MJ after the 250, 500 and 750 g preloads). There was a large difference between the 0 and 1.26 MJ preloads, but little difference in effect between the 1.26 and 2.51 MJ preloads. Mean energy intakes in the buffet after the 0, 1.26 and 2.51 MJ preloads were 6.17, 4.83 and 4.42 MJ. These results suggest that the weight or amount of food affects subsequent appetite and food intake, but the effect of energy is stronger. Duodenal lipid causes gastric relaxation, CCK secretion, and nausea. Vasopressin has been implicated in motion sickness-related nausea. We hypothesized that increasing doses of lipid enhance gastric relaxation and CCK-vasopressin secretion, resulting in a dose-related exacerbation of nausea. Nine healthy subjects received isotonic saline or lipid (1, 2, or 3 kcal/min; L1, L2, L3) duodenally. Changes in gastric volume, sensations, and plasma hormone levels were assessed during infusions and isobaric gastric distensions. Lipid infusions increased gastric volume, plasma CCK (but not vasopressin) levels, and gastric compliance during distensions, compared with saline. Plasma CCK levels were related to the dose of lipid administered [CCK levels at 30 min (pmol/l): saline 1.1 ± 0.2, L1 1.8 ± 0.2, L2 3.0 ± 0.2, L3 4.3 ± 0.6]. During distensions, nausea increased in intensity with increasing doses of lipid [score (where 0 is no sensation and 100 is strongest sensation): saline 7 ± 4, L1 19 ± 7, L2 44 ± 7, L3 66 ± 8]; however, no further rise in plasma CCK occurred. Because neither lipid nor distension alone induced significant nausea, we conclude that the interaction between these stimuli, together with a modulation by CCK, is responsible for the effects observed.
Vasopressin is not involved in lipid- and distension-induced nausea. Ghrelin is a recently identified endogenous ligand for the growth hormone secretagogue receptor. It is synthesized predominantly in the stomach and found in the circulation of healthy humans. Ghrelin has been shown to promote increased food intake, weight gain and adiposity in rodents. The effect of ghrelin on appetite and food intake in man has not been determined. We investigated the effects of intravenous ghrelin (5.0 pmol/kg/min) or saline infusion on appetite and food intake in a randomised double-blind cross-over study in nine healthy volunteers. There was a clear-cut increase in energy consumed by every individual from a free-choice buffet (mean increase 28 ± 3.9%, p<0.001) during ghrelin compared with saline infusion. Visual analogue scores for appetite were greater during ghrelin compared to saline infusion. Ghrelin had no effect on gastric emptying as assessed by the paracetamol absorption test. Ghrelin is the first circulating hormone demonstrated to stimulate food intake in man. Endogenous ghrelin is a potentially important new regulator of the complex systems controlling food intake and body weight. Background and aims: Dietary fat plays a role in the pathophysiology of symptoms in functional dyspepsia (FD). In healthy subjects, cognitive factors enhance postprandial fullness; in FD patients, attention increases gut perception. We hypothesised that the information given to patients about the fat content of a meal would affect dyspeptic symptoms. Methods: Fifteen FD patients were each studied on four occasions in a randomised double-blind fashion. Over two days they ingested a high fat yoghurt (HF) and over the other two days a low fat yoghurt (LF). For each yoghurt, the patients received the correct information about its fat content on one day (HF-C, LF-C) and the opposite (wrong) information on the other day (HF-W, LF-W).
Dyspeptic symptoms, plasma cholecystokinin (CCK) concentrations, and gastric volumes were evaluated. Results: Both the fat content and the information about the fat content affected fullness and bloating scores; both were higher after HF-C compared with LF-C, and after LF-W compared with LF-C, with no differences between HF-C and HF-W. Nausea scores were higher after HF compared with LF, with no effect of the information about fat content. No differences in discomfort and pain scores were found between study conditions. Plasma CCK and gastric volumes were greater following HF compared with LF, with no effect of the information given to the patients. All differences are p<0.05. Conclusions: Cognitive factors contribute to symptom induction in FD. Low fat foods may also elicit symptoms if patients perceive foods as high in fat, while CCK and gastric volumes do not appear to be affected by cognitive factors. Symptoms suggesting gastroparesis in patients without gastric outlet obstruction are very common, but their relation to an objective delay of gastric emptying has been poorly investigated. A dual isotopic technique was used to evaluate patients with non-obstructive dyspepsia (idiopathic and secondary) (part 1) and to assess the effects of a new gastrokinetic agent, cisapride, on gastric emptying in such patients (part 2). Sixty patients with postprandial dyspeptic symptoms (vomiting, nausea, gastric bloating or full feeling) and without lesions at upper endoscopy were studied. They were distributed into three groups: idiopathic dyspepsia (n = 31), postvagotomy dyspepsia (n = 16) and dyspepsia secondary to medical disorders (n = 13). All patients ingested the same ordinary meal; 99mTc sulphur colloid-tagged egg white was the solid phase marker and 111In chloride was the liquid phase marker.
In part 1, evaluation of gastric emptying in the first 50 patients showed a delay of the gastric emptying rate of solids and liquids as compared with controls. Striking differences separated the three groups of patients, however: percentages of delayed gastric emptying of solids and/or liquids averaged 90% in the postvagotomy and secondary dyspepsia groups, whereas it was 44% in the idiopathic dyspepsia group. Moreover, the liquid emptying rate was often the only one impaired in idiopathic dyspepsia, and in 12 of the 27 patients of this group the faster emptying rate of liquids as compared with that of solids (always found in normal subjects) could not be evidenced. In part 2, 10 patients entered a double-blind crossover study of cisapride (8 mg intravenously). A significant increase of solid (p<0.01) and liquid (p<0.05) emptying rates was found in patients with an initial gastric emptying delay. This study emphasises the importance of an objective evaluation of gastric emptying in the presence of symptoms of gastric stasis and suggests that specific locally acting therapy may be useful in patients with identified abnormal gastric emptying. BACKGROUND/AIMS: We aimed to evaluate the role of fat and cholecystokinin (CCK) in the pathophysiology of functional dyspepsia (FD) by investigating symptoms and plasma CCK levels following increasing doses of duodenal lipid during gastric distension, and the effect of CCK-A receptor blockade. SUBJECTS/METHODS: In study A, six FD patients were studied on three occasions during duodenal infusion of saline or lipid (1.1 (L-1) or 2 kcal/min (L-2)) and proximal gastric distensions. Six healthy subjects were also studied as controls during L-2 only. In study B, the effect of the CCK-A antagonist dexloxiglumide (5 mg/kg/h) on L-2-induced symptoms was studied in 12 FD patients.
Changes in gastric volume at minimal distending pressure and plasma CCK (study A) were assessed, gastric distensions were performed using a barostat, and dyspeptic symptoms were monitored. RESULTS: Lipid increased gastric volume compared with saline (ΔV (ml): saline 15 (20), L-1 122 (42), L-2 114 (28)) in patients, and even more so in controls (221 (37); p<0.05). During distensions, symptoms were greater during L-2 than during saline or L-1, and greater in patients than in controls, while gastric compliance was smaller in patients than in controls (p<0.05). Lipid increased plasma CCK levels in patients and controls (p>0.05). Dexloxiglumide abolished the increase in gastric volume (ΔV (ml): dexloxiglumide 17 (9), placebo 186 (49)) and dyspeptic symptoms (sum of scores: dexloxiglumide 24 (7), placebo 44 (19)) during duodenal lipid infusion. Dexloxiglumide also reduced gastric compliance (ml/mm Hg: dexloxiglumide 51 (7), placebo 72 (11)) and symptoms (sum of scores: dexloxiglumide 101 (17), placebo 154 (21)) during gastric distension. CONCLUSION: CCK-A receptors are involved in the generation of dyspeptic symptoms by duodenal lipid during gastric distension. One hundred and twenty consecutive outpatients with non-ulcer dyspepsia (NUD) and erosive prepyloric changes (EPC) were, after a 2-week placebo run-in period, randomly allocated to double-blind treatment with either 10-mg cisapride tablets or placebo three times daily for 4 weeks. The patients' global evaluation and total symptom score were significantly in favour of cisapride at 2 weeks (p<0.05). At 4 weeks the effect of cisapride was no longer significant (p = 0.22). Similarly, the investigators' global evaluation showed marked to moderate symptom improvement in 47% of the cisapride-treated patients as compared with 30% of the placebo-treated patients at 2 weeks.
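The 47% versus 30% responder proportions in this cisapride trial come with a reported 95% confidence interval for the difference of 0% to 35%. That interval is consistent with a normal-approximation (Wald) interval for a difference of two independent proportions, assuming roughly equal arms of about 60 patients each; the exact split of the 120 patients is not given here, so the arm sizes below are an assumption.

```python
import math

def diff_ci(p1: float, n1: int, p2: float, n2: int, z: float = 1.96):
    """Wald 95% CI for the difference of two independent proportions."""
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff - z * se, diff + z * se

# 47% vs 30% improvement, assuming ~60 patients per arm (assumed split)
lo, hi = diff_ci(0.47, 60, 0.30, 60)
# lo ≈ 0.00, hi ≈ 0.34 -- matching the published 0% to 35% to rounding
```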
The 95% confidence interval of the difference (18%) was 0% to 35%. At 4 weeks the intergroup difference was only 10% (cisapride 50% versus placebo 40%). Pain on awakening was the only symptom improved in favour of cisapride at 4 weeks. Thus, when patients with NUD and EPC are treated with cisapride, the therapeutic gain might vanish after the 2nd week of treatment. The symptoms of functional dyspepsia are still unexplained. To evaluate the possible role of abnormal visceral perception, we studied the symptomatic responses and the pressure variations during progressive gastric distension in 10 female healthy control subjects (mean age 33.6 years) and in 10 female patients with functional dyspepsia (mean age 35.2 years). A rubber balloon was positioned 4 cm below the lower esophageal sphincter (LES) and inflated with progressively larger volumes of air in steps of 50 ml; pressures at the gastric fundus and at the LES were continuously recorded by perfused manometric catheters. Each subject was studied on two separate occasions after randomized double-blind administration of either placebo or 20 mg of domperidone. Symptomatic responses and the manometric data were analyzed at the time of the initial recognition of distension (bloating step) and at the time of reporting pain or up to a maximum of 700 ml of balloon inflation (pain or 700-ml step). On placebo, the volumes of gastric distension were more than two times lower in patients than in control subjects at the bloating step (185±32 ml vs 470±40 ml, P=0.001) and at the pain or 700-ml step (265±54 ml vs 600±34 ml, P<0.005), while the pressure gradients (pressure at inflation steps minus baseline pressure before beginning inflation) were not statistically different between the two groups.
On domperidone, the volumes at each of the two steps did not change in comparison with results on placebo, except in healthy controls at the bloating step (470±40 ml on placebo vs 355±35 ml on domperidone, P<0.001); however, there was a trend for pressure gradients to increase on domperidone in comparison with results on placebo. We conclude that patients with functional dyspepsia have a lower threshold both to the initial symptomatic recognition and to perception of pain during gastric distension, and that domperidone might have an effect on the threshold of these conscious visceral sensations. This increased visceral perception may, alone or with other abnormalities of the gastroduodenal tract, explain the symptoms of functional dyspepsia. Healthy aging is associated with reductions in appetite and food intake, the so-called anorexia of aging, which may predispose to protein-energy malnutrition. One possible cause of the anorexia of aging is an increased satiating effect of cholecystokinin (CCK). To investigate the impact of aging on the satiating effects of CCK, 12 young and 12 older healthy subjects received 25-min iv infusions of saline (control) and CCK-8, 1 ng/kg per min or 3 ng/kg per min, on 3 separate days before a test meal. Older subjects ate less than young subjects, and food intake was suppressed 21.6% by CCK-8 compared with the control day (P < 0.05). The suppression of energy intake by CCK-8 in older subjects was twice that in young subjects (32 ± 6% vs. 16 ± 6% SEM, P < 0.05) and was related to plasma CCK-8 concentrations, which were higher at baseline (P < 0.05) and increased more during CCK-8 infusions in older than young subjects (P < 0.01). The extent of suppression of food intake per given rise in plasma CCK-8 concentrations did not differ between the two age groups (P = 0.35).
Endogenous CCK concentrations were higher at baseline in older subjects (P < 0.001) and decreased during the CCK-8 but not control infusions (P < 0.01), suggesting that CCK suppresses its own release. Plasma leptin concentrations were not affected by CCK infusion, whereas postprandial insulin concentrations were lowered and the peak postprandial glucose concentration was delayed but not affected by CCK-8 infusion. Because older people retain their sensitivity to the satiating effects of exogenous CCK and plasma endogenous CCK concentrations are higher in older people, increased CCK activity may contribute to the anorexia of aging. OBJECTIVE: It remains unclear whether postprandial symptom profiles differ between patients with visceral hypersensitivity and those with impaired fundic accommodation. Therefore, we evaluated the postprandial symptoms in functional dyspepsia (FD) patients classified according to proximal stomach function. In addition, the effect of gastric relaxation induced by sumatriptan on postprandial symptoms was studied in FD patients with impaired fundic accommodation. METHODS: Twenty-five healthy volunteers (HVs) and 44 FD patients filled out a disease-specific questionnaire (Nepean Dyspepsia Index) and underwent a gastric barostat study to evaluate visceral sensitivity, meal-induced fundic relaxation, and postprandial symptoms. Postprandial symptoms evoked by a drink test or reported during the barostat study were compared between FD patients subdivided according to the underlying pathophysiological mechanism. Finally, the effect of sumatriptan on postprandial symptoms evoked by a drink test was investigated in HVs and in FD patients with impaired fundic accommodation. RESULTS: There was no clear relationship between any of the 15 Nepean Dyspepsia Index symptoms and proximal stomach function.
Postprandial symptoms evoked during the barostat study or after the drink tests were significantly higher in FD patients than in HVs; however, no clear differences in symptom profile could be demonstrated between the different subclasses of FD. Sumatriptan did not affect the maximal ingested volume or the postprandial symptoms in HVs or FD patients after a drink test. CONCLUSIONS: No clear relationship could be demonstrated between postprandial symptoms and proximal stomach function. Background: Functional dyspepsia is recognized as a common disorder in clinical practice. The aim of this study was to determine the efficacy and adverse effects of cisapride compared to a placebo in patients from a general practice with functional dyspepsia (FD). Secondly, we investigated whether Helicobacter pylori-positive FD patients present with specific symptoms and determined the efficacy of cisapride for FD patients with H. pylori. BACKGROUND: The impact of response to treatment on subsequent symptoms, quality of life, health care consumption, and absence from work in functional dyspepsia is unknown. METHODS: Patients with functional dyspepsia from Denmark, France, Germany, The Netherlands, Hungary, and Poland (n = 567 (215 men), 18-80 years old) were followed up for 3 months after a 4-week treatment trial with omeprazole (20 mg or 10 mg) or placebo. The patients were blinded to the initial treatment. Dyspeptic symptoms and quality of life were assessed, and dyspepsia-related costs were calculated in terms of number of clinic visits, days on medication, and absence from work. RESULTS: Responders had fewer clinic visits than non-responders (1.5 versus 2.0 mean visits) and fewer days on medication (mean 9 days versus 23 days) over the 3-month period (both P < 0.001). The quality of life in responders was better at study entry, and this persisted over 3 months (all P < 0.001).
When analysed country by country, health care costs due to clinic visits and medications were significantly lower in responders in all countries (P < 0.05), except Denmark and The Netherlands. CONCLUSION: Symptom resolution in patients with functional dyspepsia has a positive impact on quality of life and reduces subsequent costs over a 3-month period after cessation of initial treatment. The effect of ingestion of 50 g fat, 50 g protein, and 50 g starch on plasma cholecystokinin (CCK) concentrations was studied in eight healthy volunteers. Plasma CCK concentrations were measured by radioimmunoassay with Bolton-Hunter-labelled CCK 33, CCK 33 standard, and antibody T204. Antibody T204 was directed to the sulphated tyrosine region of CCK. Ingestion of fat and protein induced significant increases in plasma CCK, whereas ingestion of starch did not significantly influence plasma CCK levels. Peak increments in plasma CCK after fat (4.8 ± 0.9 pmol/l) and protein (3.4 ± 0.5 pmol/l) were significantly greater than that after starch (0.9 ± 0.3 pmol/l). Similarly, the integrated plasma CCK secretion after fat (213 ± 49 pmol/l × 120 min) and after protein (178 ± 53 pmol/l × 120 min) was significantly greater than that found after ingestion of starch (9 ± 23 pmol/l × 120 min). It is concluded that, in contrast to starch, fat and protein are potent stimuli for the release of CCK. Gastric sensation and accommodation are studied by barostat, but this is invasive. The drink test is noninvasive and may provide similar information. We evaluated relationships between the drink test, gastric function, symptoms, and psychiatric distress. Controls (73) and functional dyspeptics (FD) (92) were studied using a 5-min water load test (WL5), gastric emptying, and electrogastrography (EGG). Symptoms, quality of life, and psychiatric distress were measured using standardized measures.
Controls underwent test-retest of WL5 and comparison of WL5 with a 100 ml/min water-based drink test (WL100) or a nutrient drink. Controls, FD patients, and gastroparetics estimated drinking capacity before WL5 using a visual analog scale. WL5 correlated with WL100 (r = 0.7929) but not with the nutrient drink test (r = 0.1995). WL5 was significantly less in FD than in controls, and an abnormal WL5 was seen in 46%. In FD, volume to fullness inversely correlated with symptom severity (r = -0.29; P = 0.0154), and WL5 produced more symptoms, particularly nausea. Gastric function was not different between FD patients with normal or abnormal WL5. Symptoms and psychiatric distress were similar between the normal and abnormal WL5 groups, but the abnormal group had significantly poorer quality of life. Controls and gastroparetics had good correlation of estimated and ingested volumes, but FD patients did not. Versus FD patients with normal WL5 capacity, FD patients with impaired drinking capacity have normal gastric function and similar symptoms but poorer quality of life. FD patients are less able to predict drinking capacity. These data suggest that WL5 identifies FD patients with intact gastric function but abnormal visceral perception. BACKGROUND/AIMS: We have previously shown that patients with functional dyspepsia are hypersensitive to gastric distention. The aim of this study was to establish whether this sensory disturbance was confined to the stomach and whether it was associated with gut reflex dysfunction. METHODS: In 10 selected patients with dyspepsia and 12 healthy controls, perception and gut reflex responses to gastric distention, duodenal distention, and somatic stimulation were measured. Standardized distentions at fixed pressures were performed by gastric and duodenal barostats. Perception was scored by a detailed symptom questionnaire; gut reflex responses were measured as isobaric volume changes by each barostat. Somatic transcutaneous electrical nerve stimulation was produced on the hand.
Individual stimuli ( 2-minute duration ) were randomly applied at 10-minute intervals in stepwise increments in search of the respective threshold for discomfort . RESULTS Patients with dyspepsia had gastric hypersensitivity to distention ( discomfort threshold at 6.4 + /- 0.4 mm Hg vs. 8.3 + /- 0.6 mm Hg in controls ; mean + /- SE ; P < 0.05 ) , whereas duodenal and somatic sensitivity was normal . Furthermore , patients with dyspepsia explicitly recognized their clinical symptoms in all gastric but only in 58 % + /- 12 % of the duodenal distention trials . In addition , patients with dyspepsia showed defective gastric relaxatory responses to duodenal distention ( 68 + /- 30 mL gastric expansion vs. 239 + /- 12 mL in controls ; P < 0.05 ) . CONCLUSIONS Patients with dyspepsia are selectively hypersensitive to gastric distention ; this sensory dysfunction is associated with impaired reflex reactivity of the stomach Intraduodenal lipid infusion induces symptoms and increases sensitivity to gastric distension in patients with functional dyspepsia . To test whether these effects are specific for lipid , we compared the effects of intraduodenal infusions of either lipid or glucose on symptoms and gastric sensory and motor responses to gastric distension . Eighteen dyspeptic patients and nine controls were studied . The stomach was distended with a flaccid bag during isocaloric infusions ( 1 kcal/ml ) of saline and either 10 % Intralipid ( nine patients ) or 26.7 % glucose ( nine patients ) into the duodenum . Dyspeptic symptoms and sensory thresholds for epigastric fullness and discomfort were assessed . Gastric pressure profiles during distensions were similar during lipid and glucose infusions in patients and controls , but both were significantly lower than during saline infusion . Lower volumes were required to induce fullness and discomfort in the patients compared with the controls .
In the controls , the threshold volumes required to induce fullness and discomfort were greater during infusion of lipid and glucose than during saline infusion , but in the patients , the threshold volumes were increased during glucose infusion but further reduced during lipid infusion . Moreover , in the patients , nausea was more common during lipid than glucose infusion and did not occur during saline . The controls did not experience any symptoms during any infusion . In conclusion , intraduodenal lipid but not glucose sensitizes the stomach to distension in patients with functional dyspepsia but not in controls Gastric motor dysfunction and concomitant gastric stasis have been implicated in the pathogenesis of nonulcer dyspepsia , but a cause-and-effect relationship is not established . Essential dyspepsia refers to a subgroup of nonulcer dyspepsia patients who have no evidence of irritable bowel syndrome , gastroesophageal reflux , or pancreaticobiliary disease . In 32 patients with essential dyspepsia , and 32 randomly selected dyspepsia-free community controls of similar age and sex , we measured gastric emptying of solids using Tc99m-Sulphur Colloid in a fried egg sandwich . Subjects with neuromuscular or other diseases that may alter gastric emptying were excluded . Symptoms were assessed by a standard questionnaire . Data processing was carried out " blinded " to the subjects ' clinical status . Female patients took significantly longer to empty half the initial stomach activity ( mean 90 min ) than female controls ( mean , 73 min ; p = 0.02 ) . The rate of emptying at 25 min was also significantly less in female patients than in controls . Female and male controls , and male patients , had similar emptying times . Delayed emptying was not associated with the occurrence of postprandial pain , belching , or nausea ; there was a trend for the half-time rate of emptying to be greater in patients with abdominal distention .
While gastric emptying of solids is slightly delayed in females with essential dyspepsia as a group , this may not explain their symptoms Motilin‐receptor agonists are prokinetics ; whether they relieve the symptoms of functional dyspepsia is unknown . We aimed to test the efficacy of the motilin agonist ABT‐229 in functional dyspepsia patients with and without delayed gastric emptying BACKGROUND AND AIMS It is unclear whether fat digestion is required for the induction of gastrointestinal sensations and whether different fats have different effects . We investigated the effect of fat digestion and of medium-chain triglycerides ( MCTs ; C < 12 ) and long-chain triglycerides ( LCTs ; C > 16 ) on gastrointestinal sensations . METHODS In a double-blind study , 15 healthy subjects were studied on 5 occasions during which LCT or MCT emulsions ( 2 kcal/min ) , with or without 120 mg tetrahydrolipstatin ( THL , lipase inhibitor ) , or sucrose polyester ( SPE , nondigestible fat ) were infused intraduodenally in randomized order . After 30 minutes , the proximal stomach was distended in 1 mm Hg steps/min . Intensity of gastrointestinal sensations ( on a 0 - 10 visual analog scale ) , plasma cholecystokinin ( CCK ) levels , and gastric volumes were assessed throughout . RESULTS LCT and MCT increased gastric volume at baseline pressure compared with SPE , and LCT more than MCT . THL entirely abolished this effect ( volumes [ mL ] : LCT , 213 + /- 19 ; LCT-THL , 39 + /- 3 ; MCT , 155 + /- 12 ; MCT-THL , 43 + /- 5 ; SPE , 44 + /- 5 ) . Only LCT increased plasma CCK levels ( pmol/L per 30 minutes : LCT , 21 + /- 2 ; LCT-THL , 9 + /- 1 ; MCT , 9 + /- 1 ; MCT-THL , 11 + /- 1 ; SPE , 9 + /- 1 ) . During distentions , intragastric volumes were greater during infusion of LCT and MCT than during the respective THL conditions or SPE , but plasma CCK levels did not change . The intensity of sensations increased ( hunger decreased ) more with LCT than with MCT .
During infusion of THL or SPE , the effects were smaller than during LCT or MCT . CONCLUSIONS Fat digestion is required for the modulation of gastrointestinal sensations during gastric distention . The effects of fat depend on the fatty acid chain length and are not entirely explained by release of CCK We randomly assigned 159 patients with non-ulcer dyspepsia , defined as chronic or recurrent epigastric pain without concomitant symptoms of the irritable bowel syndrome and with no evidence of organic disease , to treatment for three weeks with an antacid suspension one and three hours after meals , 400 mg of cimetidine twice a day , or placebo , according to a double-blind , double-dummy model . The intensity and duration of epigastric pain were recorded by the patients four times daily during a one-week period without therapy and during the three weeks of treatment . The mean reduction in pain intensity after three weeks in the placebo group was 25 percent . Neither antacid nor cimetidine treatment resulted in more than a 4 percent better effect . The reduction of pain was statistically significant ( P less than 0.01 ) in all three groups . The time course of the pain scores in the groups receiving active drugs followed closely those in the placebo group , and there were no significant differences between the groups at any stage of the treatment . We conclude that the neutralization or suppression of gastric acid is of no clinical value in patients with this syndrome Fasting antral area was examined by ultrasonography in 40 healthy subjects and in 106 patients with non-ulcer dyspepsia ( NUD ) and erosive prepyloric changes ( EPC ) before and after treatment with cisapride or placebo . The patients were examined twice , first after a run-in period of 14 days of placebo and then after 14 days of cisapride , 10 mg three times daily , or placebo .
The relaxed width of the antral area was measured in two sections : a vertical section in which the antrum , the superior mesenteric vein , and the aorta were visualized simultaneously , and a horizontal section that included the pylorus and the middle of the antrum up to 5 cm proximal to the pylorus . The mean antral area was wider ( p less than 0.001 ) , both in vertical and horizontal sections , in patients with NUD and EPC than in controls . The antral area in NUD patients was wider ( p less than 0.05 ) in smokers than in non-smokers . The area tended to decrease during treatment with cisapride ( p = 0.08 ) . Bloating was the only symptom significantly associated with a wide antral area ( p = 0.01 ) . The results suggest a relationship between a wide fasting antral area and NUD with EPC The relative contributions of altered gastric motor function and Helicobacter pylori-associated active chronic gastritis to the pathogenesis of functional dyspepsia are controversial . We therefore evaluated scintigraphically the intragastric distribution and gastric emptying of a mixed solid-liquid meal in 75 patients with functional dyspepsia ; patients were subdivided on the basis of both specific symptom clusters and the presence or absence of H. pylori gastritis . Twenty-one ( 28 % ) patients displayed abnormal solid and/or liquid gastric emptying , with prolonged solid lag time the most prominent alteration detected . The number of patients with abnormal scintigraphic patterns increased to 36 ( 48 % ) when intragastric distribution parameters ( fundal half-emptying time and antral maximal fraction ) were examined .
Although patients with reflux-like dyspepsia ( N=36 ) demonstrated significantly slower rates of liquid emptying at 45 and 70 min and a higher prevalence of abnormal liquid intragastric distribution when compared to patients with motility-like dyspepsia ( N=39 ) or to controls ( N=34 ) , the absolute differences were small and unlikely to be of clinical significance . Patients without H. pylori gastritis ( N=50 ) demonstrated a significantly more prolonged solid lag time when compared to those with H. pylori gastritis ( N=25 ) , but the difference was small and there were no other differences between these two subgroups . We conclude that in patients with functional dyspepsia : ( 1 ) abnormal solid gastric emptying is present in less than one third ; ( 2 ) assessment of parameters of intragastric distribution enables more subtle gastric motor dysfunction to be identified ; and ( 3 ) neither dividing patients into symptom subgroups nor accounting for the presence or absence of H. pylori gastritis has a major influence on the prevalence or type of gastric motor dysfunction BACKGROUND & AIMS Disturbed gastric accommodation and sensation contribute to postprandial symptoms in dyspepsia , but the controlling mechanisms are unclear . Nitrergic and alpha2-adrenergic modulation of gastric sensory and motor function were assessed in this study . METHODS Using a factorial design , we assessed drug effects on gastric sensation during isobaric distentions and fasting and postprandial gastric motor function in 32 healthy volunteers . Each participant received one treatment : placebo ; 0.3 or 0.5 microgram . kg-1 . min-1 intravenous nitroglycerin ; 0.0125 , 0.025 , or 0.1 mg clonidine orally ; or combined nitroglycerin plus clonidine . In 16 other healthy subjects , the effects of clonidine and placebo on gastric emptying of solids were evaluated using the 13C-octanoic acid breath test .
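Several of the emptying studies above report a half-emptying time (T1/2): the time at which half of the initial meal activity has left the stomach. A minimal sketch of how T1/2 can be read off a measured retention curve by linear interpolation between sampling points (the retention values below are illustrative, not study data):

```python
# Hedged sketch: estimating gastric half-emptying time (T1/2) from a retention
# curve, as used in the scintigraphic studies above. Assumption: retention is
# monitored at discrete times and interpolated linearly between samples.

def half_emptying_time(times_min, retention_frac):
    """Return the first time at which retention falls to 0.5, or None."""
    for (t0, r0), (t1, r1) in zip(zip(times_min, retention_frac),
                                  zip(times_min[1:], retention_frac[1:])):
        if r0 >= 0.5 >= r1:
            # linear interpolation within the bracketing interval
            return t0 + (r0 - 0.5) / (r0 - r1) * (t1 - t0)
    return None  # meal not half-emptied within the observation window

times = [0, 15, 30, 45, 60, 90]              # minutes after the meal
retention = [1.0, 0.9, 0.7, 0.55, 0.45, 0.2]  # fraction of activity remaining
print(half_emptying_time(times, retention))   # 52.5 min
```

Curve-fitting approaches (e.g. fitting a power-exponential model to the whole curve) are also common and are less sensitive to noise at individual sampling points.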
RESULTS Clonidine and nitroglycerin increased gastric compliance , but normal postprandial accommodation was still observed despite the induced relaxation . Clonidine but not nitroglycerin reduced aggregate and pain perception averaged over four distention levels . There were no significant drug interactions . No dose effect of clonidine was observed on gastric emptying . CONCLUSIONS Clonidine relaxes the stomach and reduces gastric sensation without inhibiting accommodation or emptying . Nitroglycerin relaxes the stomach without altering perception . Studies of the effects of clonidine on these gastric functions and symptoms in disease are warranted Paired studies were carried out on 12 healthy male subjects to compare the effect of intravenous doses of the 5‐HT1‐like agonist sumatriptan ( GR43175 ; 3 mg ) , 10 mg metoclopramide and saline control on the rate of gastric emptying of a radiolabelled liquid test‐meal . Intravenous administration of metoclopramide accelerated gastric emptying by decreasing the lag period , while intravenous administration of sumatriptan delayed gastric emptying by increasing the lag period . The observation that sumatriptan causes a delay in gastric emptying in normal healthy volunteers , but relieves nausea and vomiting during migraine attacks , suggests that sumatriptan may be acting via a central mechanism to relieve symptoms of nausea and vomiting associated with migraine BACKGROUND By means of duplex sonography , gastric emptying can be related to antral motor activity . The aim of this study was to examine gastric emptying in relation to antral contractions during and immediately after ingestion of a liquid meal in healthy subjects and to study the effect of glyceryl trinitrate ( GTN ) on this early phase of gastric emptying .
METHODS Ten healthy , non-smoking men ( median age , 36 years ; range , 29 - 41 years ) were studied twice on separate days , once without drug administration and once after taking a 0.5-mg sublingual GTN tablet 3 min before ingesting 500 ml of a meat soup ( 20 kcal ; Toro ) . The subjects were investigated during 3 min of fasting , during 3 min of drinking the soup , and during the first 10 min postprandially . RESULTS Transpyloric forward flow commenced on average 80 sec and 95 sec after the start of drinking the soup without and with GTN , respectively ( P = NS ) . Non-contractile , pulsatile transpyloric flow ( that is , pendulating , transpyloric flow not associated with antral contractions ) occurred during episodes of concurrent relaxation of the terminal antrum , the pylorus , and the duodenal bulb . This type of flow occurred mainly just before the start of contractile , pulsatile transpyloric flow ( associated with propulsive antral contractions ) . Initial non-contractile , pulsatile transpyloric flow before commencement of contractile , pulsatile transpyloric flow lasted longer with GTN ( 188 sec ) than without GTN ( 25 sec ) ( P < 0.05 ) . Consequently , contractile , pulsatile transpyloric flow commenced later with GTN ( 302 sec ) than without ( 102 sec ) ( P < 0.05 ) . CONCLUSIONS Non-contractile transpyloric flow seems to be a physiologic phenomenon during the early phase of gastric emptying . GTN prolongs the initial phase of non-contractile , and delays the onset of contractile , pulsatile transpyloric flow Delayed gastric emptying , impaired gastric accommodation to a meal and hypersensitivity to gastric distension have been implicated in the pathophysiology of functional dyspepsia . Dyspeptic patients are often treated with the prokinetic drug cisapride
11,035
18,829,599
Study findings suggest that the favorable physiological responses to exercise might slow some of the pathophysiological progression of HF .
The incidence of heart failure ( HF ) is increasing as the population ages . Pharmacotherapy is an important component of treatment and yields significant improvements in survival and quality of life . In recent decades , exercise has gradually become accepted as an intervention beneficial to patients with HF , but more information is needed to clarify the effects of exercise and optimize interventions .
AIMS Recent guidelines recommend regular exercise in the management of patients with chronic heart failure ( CHF ) . This study was designed to compare the safety and efficacy of conventional bicycle exercise and functional electrical stimulation ( FES ) of the legs as forms of home-based exercise training for patients with stable CHF . METHODS AND RESULTS Forty-six patients ( 38 male ) with stable NYHA Class II/III heart failure underwent a 6-week training programme using either a bicycle ergometer or electrical stimulation of the quadriceps and gastrocnemius muscles . In the bike group , significant increases were seen in 6-min walk ( 44.6 m , 95 % confidence interval ( CI ) 29.3 - 60.9 m ) , treadmill exercise time ( 110 s , 95 % CI 72.2 - 148.0 s ) , maximum leg strength ( 5.32 kg , 95 % CI 3.18 - 7.45 kg ) , and quadriceps fatigue index ( 0.08 , 95 % CI 0.04 - 0.12 ) following training . In the stimulator group , similar significant increases were seen following training for 6-min walk ( 40.6 m , 95 % CI 28.2 - 53.0 m ) , treadmill exercise time ( 67 s , 95 % CI 11.8 - 121.8 s ) , maximum leg strength ( 5.35 kg , 95 % CI 1.53 - 9.17 kg ) , and quadriceps fatigue index ( 0.10 , 95 % CI 0.04 - 0.17 ) . Peak VO(2) did not change in either group following training , indicating a low-intensity regime . Quality of life scores improved following training when the bicycle and stimulator groups were considered together , but not when considered separately ( -0.43 , 95 % CI -8.13 to -0.56 ) . CONCLUSIONS FES produces beneficial changes in muscle performance and exercise capacity in patients with CHF . Within this study , the benefits were similar to those observed following bicycle training .
FES could be offered to patients with heart failure as an alternative to bicycle training as part of a home-based rehabilitation programme Systemic arterial compliance ( SAC ) makes an important contribution to cardiac afterload , and thus is a significant determinant of left ventricular work . Previous studies have suggested that arterial compliance may be reduced in patients with congestive heart failure ( CHF ) , and that SAC is increased after a 4-week exercise training programme in healthy , sedentary individuals . The present study aimed to investigate the effects of an 8-week exercise training programme on arterial mechanical properties , left ventricular performance and quality of life in CHF patients . A total of 21 patients with NYHA class II or III CHF ( mean+/-S.D. age 55+/-13 years ) were randomly allocated to either an 8-week exercise training group or a " usual lifestyle " control group . SAC , as determined non-invasively using applanation tonometry and Doppler aortic velocimetry , increased from 0.57+/-0.11 to 0.77+/-0.14 arbitrary compliance units ( mean+/-S.E.M. ; P=0.01 ) in the exercise group , while no change occurred in the control group . Left ventricular structure and function was assessed by echocardiography , and these parameters were unchanged over the 8-week study period . Exercise training significantly increased exercise capacity , measured by a 6-min walking test ( 474+/-27 to 547+/-34 m ; P=0.008 ) . Quality of life , as assessed using the Minnesota Living with Heart Failure Evaluation , demonstrated a decrease in heart failure symptoms from 46+/-7 to 24+/-5 units ( P=0.01 ) following the exercise training programme . These data show that exercise training improves SAC in patients with CHF .
The accompanying improvement in exercise capacity may be due , in part , to an improvement in arterial function AIM To assess changes in quality of life ( QoL ) and oxygen consumption produced by two different patterns of physical training in patients with congestive heart failure ( CHF ) . MATERIAL AND METHODS 42 men ( mean age 55.9+/-8.1 years ) with ischaemic CHF lasting 3.1+/-1.0 years . Patients were randomised into three groups each consisting of 14 men : group A -- with constant workload , group B -- with progressive/increasing workload , each trained up to 6 months and group C -- not trained . QoL was assessed at baseline and at 6 months by means of the Psychological General Well-being Index ( PGWB ) and the Subjective Symptoms Assessment Profile ( SSA-P ) . Cardiopulmonary exercise test and echocardiography were performed twice . RESULTS At 6 months improvement in PGWB total index was observed , both in groups A and B ( p<0.01 ) . Men from groups A and B reported less cardiac symptoms ( p<0.01 ) , emotional distress ( p<0.01 ) , peripheral circulatory symptoms ( p<0.01 ) and dizziness ( p<0.01 ) in SSA-P. Improvement in sexual life was observed only in group B ( p<0.01 ) . Overall improvement of QoL was greater in group B than in group A as well as oxygen uptake ( p<0.01 ) . Higher QoL correlated positively with peak VO2 only in group B ( r=0.56 , p<0.05 ) . CONCLUSIONS Physical training improves QoL in men with CHF , but only progressive/increasing workload seems to markedly improve oxygen uptake . Improvement of QoL is related to psychological well-being and physical complaints associated with CHF Background Heart rate recovery ( HRR1 ) immediately after exercise reflects parasympathetic activity , which is markedly attenuated in chronic heart failure ( CHF ) patients . The aim of our study was to examine both continuous and interval exercise training effects on HRR1 in these patients .
Design The population study consisted of 29 stable CHF patients who participated in a rehabilitation program of 36 sessions , three times per week . Of the 29 patients , 24 completed the program . Patients were randomly assigned to interval { n = 10 [ 100 % peak work rate ( WRp ) for 30 s , alternating with rest for 30 s ] } and to continuous training [ n = 14 ( 50%WRp ) ] . Methods All patients performed a symptom-limited cardiopulmonary exercise test on a cycle ergometer before and after the completion of the program . Measurements included peak oxygen uptake ( VO2p ) , anaerobic threshold ( AT ) , WRp , first degree slope of VO2 during the first minute of recovery ( VO2/t-slope ) , chronotropic response [ % chronotropic reserve ( CR ) = ( peak HR - resting HR ) × 100/(220 - age - resting HR ) ] , HRR1 ( HR difference from peak exercise to one minute after ) . Results After the completion of the rehabilitation program there was a significant increase of WRp , VO2p , AT and VO2/t-slope ( by 30 % , P=0.01 ; 6 % , P=0.01 ; 10 % , P=0.02 ; and 27 % , P=0.03 respectively for continuous training and by 21 % , P≤0.05 ; 8 % , P=0.01 ; 6 % , P = NS ; and 48 % , P=0.02 respectively for interval training ) . However , only patients exercised under the continuous training regime had a significant increase in HRR1 ( 15.0±9.0 to 24.0±12 bpm ; P=0.02 ) and CR ( 57±19 to 72±21 % , P=0.02 ) , in contrast with those assigned to interval training ( HRR1 : 21 ± 11 to 21 ± 8 bpm ; P = NS and CR : 57 ± 18 to 59 ± 21 % , P = NS ) . Conclusions Both continuous and interval exercise training programs improve exercise capacity in CHF patients . However , continuous rather than interval exercise training improves early HRR1 , a marker of parasympathetic activity , suggesting a greater contribution to the autonomic nervous system AIMS Hydrotherapy , i.e.
exercise in warm water , as a rehabilitation program has been considered potentially dangerous in patients with chronic heart failure ( CHF ) due to the increased venous return caused by the hydrostatic pressure . However , hydrotherapy has advantages compared to conventional training . We studied the applicability of an exercise programme in a temperature-controlled swimming pool , with specific reference to exercise capacity , muscle function , quality of life and safety . METHODS AND RESULTS Twenty-five patients with CHF ( NYHA II-III , age 72.1+/-6.1 ) were randomised into either 8 weeks of hydrotherapy ( n=15 ) , or into a control group ( n=10 ) . The training program was well tolerated with no adverse events . Patients in the hydrotherapy group improved their maximal exercise capacity ( + 6.5 vs. -5.9 W , P=0.001 ) , isometric endurance in knee extension ( + 4 vs. -9 s , P=0.01 ) together with an improvement in the performance of heel-lift ( + 4 vs. -3 n.o . , P<0.01 ) , shoulder abduction ( + 12 vs. -8 s , P=0.01 ) and shoulder flexion ( + 6 vs. + 4 , P=0.01 ) in comparison to patients in the control group . CONCLUSION Physical training in warm water was well tolerated and seems to improve exercise capacity as well as muscle function in small muscle groups in patients with CHF . This new approach broadens the variety of training regimes for older patients with CHF BACKGROUND Heart failure , a condition predominantly affecting the elderly , represents an ever-increasing clinical and financial burden for the NHS . Cardiac rehabilitation , a service that incorporates patient education , exercise training and lifestyle modification , requires further evaluation in heart failure management . AIM The aim of this study was to determine whether a cardiac rehabilitation programme improved on the outcomes of an outpatient heart failure clinic ( standard care ) for patients , over 60 years of age , with chronic heart failure .
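The heart-rate indices defined in the interval-versus-continuous training study above (percent chronotropic reserve and HRR1) are simple enough to sketch directly. A minimal illustration, with made-up patient numbers, following the formula quoted in that abstract:

```python
# Sketch of the heart-rate indices from the training study above.
# %CR formula is taken from the abstract; the input values are illustrative.

def chronotropic_reserve(peak_hr, resting_hr, age):
    """%CR = (peak HR - resting HR) x 100 / (220 - age - resting HR)."""
    return (peak_hr - resting_hr) * 100 / (220 - age - resting_hr)

def hrr1(peak_hr, hr_1min_after):
    """Heart rate recovery: HR drop from peak to one minute into recovery."""
    return peak_hr - hr_1min_after

# Illustrative values: peak HR 130 bpm, resting HR 70 bpm, age 60 years.
print(round(chronotropic_reserve(peak_hr=130, resting_hr=70, age=60), 1))  # 66.7 %
print(hrr1(peak_hr=130, hr_1min_after=112))                                # 18 bpm
```

The denominator uses the age-predicted maximum heart rate (220 - age), so %CR expresses how much of the theoretically available heart-rate range the patient actually used at peak exercise.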
METHODS Two hundred patients ( 60 - 89 years , 66 % male ) with New York Heart Association ( NYHA ) II or III heart failure confirmed by echocardiography were randomised . Both standard care and experimental groups attended clinic with a cardiologist and specialist nurse every 8 weeks . Interventions included exercise prescription , education , dietetics , occupational therapy and psychosocial counselling . The main outcome measures were functional status ( NYHA , 6-min walk ) , health-related quality of life ( MLHF and EuroQol ) and hospital admissions . RESULTS There were significant improvements in MLHF and EuroQol scores , NYHA classification and 6-min walking distance ( meters ) at 24 weeks between the groups ( p<0.001 ) . The experimental group had fewer admissions ( 11 vs. 33 , p<0.01 ) and spent fewer days in hospital ( 41 vs. 187 , p<0.001 ) . CONCLUSIONS Cardiac rehabilitation , already widely established in the UK , offers an effective model of care for older patients with heart failure AIMS Patients with chronic heart failure ( CHF ) exhibit detrimental changes in skeletal muscle that contribute to their impaired physical performance . This study investigates the possibility of counteracting these changes by chronic low-frequency electrical stimulation ( CLFS ) of left and right thigh muscles . METHODS AND RESULTS ( mean+/-SD ) 32 CHF patients ( 53+/-10 years ) with an LVEF of 22+/-5 % , NYHA II-IV , undergoing optimized drug therapy , were randomized in a CLFS group ( CLFSG ) or a control group ( controls ) . The groups differed in terms of the intensity of stimulation , which elicited strong muscle contractions only in the CLFSG , whereas the controls received current input up to the sensory threshold without muscle contractions . Functional capacity was assessed by peak VO(2 ) , work capacity , and a 6-min-walk ( 6-MW ) .
Muscle biopsies were analyzed for myosin heavy chain ( MHC ) isoforms , citrate synthase ( CS ) and glyceraldehydephosphate dehydrogenase ( GAPDH ) activities . Peak VO(2) ( ml min(-1) kg(-1) ) increased from 9.6+/-3.5 to 11.6+/-2.8 ( P<0.001 ) in the CLFSG , and decreased from 10.6+/-2.8 to 9.4+/-3.2 ( P<0.05 ) in the controls . The increase in the CLFSG was paralleled by increases in maximal workload ( P<0.05 ) and oxygen uptake at the anaerobic threshold ( P<0.01 ) . The corresponding values of the controls were unchanged , as were the 6-MW values , the MHC isoform distribution , and both CS and GAPDH activities . In the CLFSG , the 6-MW values increased ( P<0.001 ) , CS activity was elevated ( P<0.05 ) , GAPDH activity decreased ( P<0.01 ) , and the MHC isoforms were shifted in the slow direction with increases in MHCI at the expense of MHCIId/x ( P<0.01 ) . CONCLUSIONS Our results suggest that CLFS is a suitable treatment to counteract detrimental changes in skeletal muscle and to increase exercise capacity in patients with severe CHF BACKGROUND A proinflammatory state is recognized in chronic heart failure and the degree of immune activation corresponds to disease severity and prognosis . Training is known to improve symptoms in heart failure but less is known about the effects of specific forms of training on the proinflammatory state . METHODS Forty-six patients with stable chronic heart failure underwent a home-based program of exercise training for 30 minutes a day , 5 days per week over a 6-week period . Twenty-four used a bicycle ergometer and 22 used an electrical muscle stimulator applied to quadriceps and gastrocnemius muscles . Tumour necrosis factor-alpha ( TNF-alpha ) , TNF-alpha soluble receptors 1 and 2 , interleukin 6 , and C-reactive protein were measured before and after the training period . RESULTS Significant improvements in markers of exercise performance were seen in both training groups .
Soluble TNF-alpha receptor 2 levels decreased after training in the bike group only ( 2900 + /- 1069 pg/mL to 2625 + /- 821 pg/mL , P = .013 ) . Trends towards a decrease in levels of TNF-alpha and soluble receptor 1 were also seen in the bike group only . No change in circulating inflammatory markers was observed after stimulator training . CONCLUSIONS Physical training improves exercise capacity for patients with chronic heart failure but the degree of attenuation of the proinflammatory response may depend on the mode of training despite similar improvements in exercise capacity Decreased exercise capacity is the main factor restricting the daily life of patients with chronic congestive heart failure ( CHF ) . We performed a controlled , randomized study to evaluate the effect of dynamic exercise training of moderate intensity on exercise capacity and gas exchange in patients with CHF . Twenty-seven patients with stable CHF , New York Heart Association ( NYHA ) functional class II and III , were randomized to training ( n = 12 ) and control ( n = 15 ) groups . During a 3-month period , the training group underwent a supervised physical training program using a bicycle ergometer for 30 min 3 times a week at a load corresponding to 50 to 60 % of their peak oxygen consumption . Thereafter , they were advised to continue training at home for the next 3 months . The control group did not change their previous physical activity . A graded maximal exercise test with respiratory gas analysis and an endurance test with constant submaximal workload were performed at baseline and after 3 and 6 months . The exercise endurance increased from 14.7 + /- 2.0 to 27.8 + /- 2.7 min ( p < 0.01 ) and the peak oxygen consumption tended to improve from 19.3 + /- 1.6 to 21.7 + /- 2.3 mL/kg/min ( p = 0.09 ) during the supervised training period .
At submaximal workloads , minute ventilation was reduced by 16 % per se ( p < 0.01 ) and by 7 % in proportion to carbon dioxide production ( p < 0.05 ) . Oxygen consumption at the anaerobic threshold increased from 10.5 + /- 0.8 to 12.7 + /- 1.0 mL/kg/min ( p < 0.05 ) . The positive training effects were associated with an improvement in the NYHA functional class . The effects of supervised training were preserved during the home-based training period . The results indicate that physical training of moderate intensity significantly improves the exercise capacity and reduces the exaggerated ventilatory response to exercise , particularly at submaximal working levels in patients with CHF . This is associated with alleviation of symptoms OBJECTIVES The aim of this study was to assess the effects of regular physical exercise on local inflammatory parameters in the skeletal muscle of patients with chronic heart failure ( CHF ) . BACKGROUND Inflammatory activation with increased serum cytokine levels and expression of inducible nitric oxide synthase ( iNOS ) in the myocardium and peripheral skeletal muscles has been described in CHF . METHODS Twenty male patients with stable CHF ( left ventricular ejection fraction 25 + /- 2 % ; age 54 + /- 2 years ) were randomized to a training group ( n = 10 ) or a control group ( n = 10 ) . At baseline and after six months , serum samples and vastus lateralis muscle biopsies were obtained . Serum tumor necrosis factor (TNF)-alpha , interleukin (IL)-6 , and IL-1-beta levels were measured by enzyme-linked immunosorbent assay , and local cytokine and iNOS expression by real-time polymerase chain reaction . RESULTS Exercise training improved peak oxygen uptake by 29 % in the training group ( from 20.3 + /- 1.0 to 26.1 + /- 1.5 ml/kg . min ; p < 0.001 vs. control group ) .
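Relative improvements like the 29 % gain in peak oxygen uptake reported above (20.3 to 26.1 ml/kg/min) are plain percent changes from the baseline mean, which can be checked in one line:

```python
# Quick arithmetic check of the relative improvements quoted in these abstracts.
# percent_change is a generic helper, not a function from any of the studies.

def percent_change(before, after):
    """Relative change from baseline, in percent."""
    return (after - before) / before * 100

# Peak VO2: 20.3 -> 26.1 ml/kg/min, reported above as a 29 % improvement.
print(round(percent_change(20.3, 26.1)))  # 29
```

The same helper reproduces the other group-level percentages in this section (e.g. the 52 % reduction in iNOS expression corresponds to 6.3 falling to roughly 3.0 relative units).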
While serum levels of TNF-alpha, IL-6, and IL-1-beta remained unaffected by training, local skeletal muscle TNF-alpha decreased from 1.9 ± 0.4 to 1.2 ± 0.3 relative U (p < 0.05 for change vs. control group), IL-6 from 71.3 ± 16.5 to 41.3 ± 8.8 relative U (p < 0.05 vs. baseline), and IL-1-beta from 2.7 ± 1.1 to 1.4 ± 0.6 relative U (p = 0.02 vs. control group). Exercise training also reduced local iNOS expression by 52% (from 6.3 ± 1.2 to 3.0 ± 1.0 relative U; p = 0.007 vs. control group). CONCLUSIONS Exercise training significantly reduced the local expression of TNF-alpha, IL-1-beta, IL-6, and iNOS in the skeletal muscle of CHF patients. These local anti-inflammatory effects of exercise may attenuate the catabolic wasting process associated with the progression of CHF.

PURPOSE The aim of this study was to compare the effects of endurance training alone (ET) with combined endurance and strength training (CT) on hemodynamic and strength parameters in patients with congestive heart failure (CHF). METHODS Twenty male patients with CHF were randomized into one of two training regimens consisting of endurance training or a combination of endurance and resistance training. Group ET had 40-min interval cycle ergometer endurance training three times per week. Group CT combined endurance and strength training, with the same interval endurance training for 20 min followed by 20 min of strength training. Left ventricular function was assessed at baseline and after 40 training sessions by echocardiography and radionuclide ventriculography. Work capacity was measured with a cardiopulmonary exercise test (CPX) and lactate determination. Strength was measured with an isokinetic dynamometer. RESULTS After 40 sessions, the ET group improved functional class, work capacity, peak torque, and muscular endurance. However, peak VO2 remained unchanged.
Left ventricular ejection fraction (LVEF) and fractional shortening (FS) decreased, whereas left ventricular end-diastolic diameter (LVED) increased. The CT group improved NYHA score, work capacity, peak VO2, and peak lactate; peak torque and muscular endurance, LVEF, and FS increased, whereas LVED decreased. Compared with ET, CT was significantly (P < 0.05) better at improving LV function. CONCLUSION Combined endurance/strength training was superior to endurance training alone with respect to improvement of LV function, peak VO2, and strength parameters. It appears that, for stable CHF patients, a greater benefit can be derived from this training modality.

BACKGROUND The purpose of this study was to determine the effects of systemic exercise training on endothelium-mediated arteriolar vasodilation of the lower limb and its relation to exercise capacity in chronic heart failure (CHF). Endothelial dysfunction is a key feature of CHF, contributing to increased peripheral vasoconstriction and impaired exercise capacity. Local handgrip exercise has previously been shown to enhance endothelium-dependent vasodilation in conduit and resistance vessels in CHF. METHODS AND RESULTS Twenty patients were prospectively randomized to a training group (n = 10, left ventricular ejection fraction [LVEF] 24 ± 4%) or a control group (n = 10, LVEF 23 ± 3%). At baseline and after 6 months, peak flow velocity was measured in the left femoral artery using a Doppler wire; vessel diameter was determined by quantitative angiography. Peripheral blood flow was calculated from average peak velocity (APV) and arterial cross-sectional area. After exercise training, nitroglycerin-induced endothelium-independent vasodilation remained unaltered (271% versus 281%, P = NS).
Peripheral blood flow improved significantly in response to 90 microg/min acetylcholine by 203% (from 152 ± 79 to 461 ± 104 mL/min, P < 0.05 versus control group), and the inhibiting effect of L-NMMA increased by 174% (from -46 ± 25 to -126 ± 19 mL/min, P < 0.05 versus control group). Peak oxygen uptake increased by 26% (P < 0.01 versus control group). The increase in peak oxygen uptake was correlated with the endothelium-dependent change in peripheral blood flow (r = 0.64, P < 0.005). CONCLUSIONS Regular physical exercise improves both basal endothelial nitric oxide (NO) formation and agonist-mediated endothelium-dependent vasodilation of the skeletal muscle vasculature in patients with CHF. The correction of endothelial dysfunction is associated with a significant increase in exercise capacity.

Recently, in a cross-sectional study (Krüger et al., 2002), a correlation of moderate degree was documented between serum BNP (brain natriuretic peptide) and exercise capacity in patients with chronic heart failure (CHF). However, it remains unknown whether BNP, which increases in response to high myocardial wall stress, is sufficiently sensitive to changes in exercise capacity during clinical follow-up. To elucidate this, 42 CHF patients were recruited and randomized into a training (T; 58 ± 10 years; n = 14 NYHA II; n = 5 NYHA III) and a control group (CO; 54 ± 9; n = 17 NYHA II; n = 6 NYHA III).
T carried out 12 weeks of endurance training on a cycle ergometer (4 sessions per week, 45 min duration). Venous blood sampling and cycle ergometry with simultaneous gas exchange measurements were carried out prior to and after the experimental phase. Owing to its superior stability during laboratory procedures, NT-proBNP was determined instead of BNP; both proteins are secreted in equimolar amounts and share an identical diagnostic meaning. In both groups, NT-proBNP decreased slightly (T: from 1092 ± 980 to 805 ± 724 pg/ml; CO: from 1075 ± 1068 to 857 ± 1138 pg/ml; T vs. CO: p = 0.65). The anaerobic threshold (AT) as a measure of exercise capacity went up in T (from 0.96 ± 0.17 to 1.10 ± 0.22 l/min) but remained almost constant in CO (pre: 1.02 ± 0.27; post: 1.00 ± 0.27 l/min; T vs. CO: p < 0.001). The correlation between changes in NT-proBNP and changes in AT remained insignificant (r = 0.02, p = 0.89), even if only T was considered (r = 0.09, p = 0.72). Improved exercise capacity in CHF patients due to 3 months of endurance training is thus not reflected in the course of NT-proBNP. These findings are inconsistent with a sufficient sensitivity of this parameter to detect changes in exercise capacity during clinical follow-up. Changes in NT-proBNP beyond its spontaneous variability are more likely to be detected following therapeutic interventions that aim more directly at the myocardium. In determining alterations of functional capacity, ergometric testing cannot be replaced by serial determinations of NT-proBNP.

BACKGROUND Exercise training in heart failure patients improves exercise capacity, physical function, and quality of life. Prior studies indicate a rapid loss of these effects following termination of the training. We wanted to assess any sustained post-training effects on patients' global assessment of change in quality of life (PGACQoL) and physical function.
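The NT-proBNP study above rests on a Pearson correlation between per-patient changes in AT and changes in NT-proBNP. A minimal, self-contained sketch of that computation (the function name and the delta values are mine, for illustration only, not the study's data):

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-patient changes: delta AT (l/min) vs. delta NT-proBNP (pg/ml)
d_at = [0.10, 0.25, 0.05, 0.18, 0.12]
d_bnp = [-300.0, -150.0, -420.0, -80.0, -260.0]
print(pearson_r(d_at, d_bnp))
```

A value near zero, as reported in the study (r = 0.02), indicates that training-induced gains in exercise capacity were not mirrored by NT-proBNP changes.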
METHODS Fifty-four stable heart failure patients were randomised to exercise or control. The 4-month exercise programme consisted of bicycle training at 80% of maximal intensity three times/week, and 49 patients completed the active study period. At 10 months (6 months post-training), 37 patients were assessed regarding PGACQoL, habitual physical activity, and the dyspnea-fatigue index. RESULTS Both post-training patients (n = 17) and controls (n = 20) deteriorated in PGACQoL during the 6-month extended follow-up, although insignificantly. However, post-training patients improved PGACQoL slightly but significantly from baseline to 10 months (P = 0.006), differing significantly (P = 0.023) from controls, who were unchanged. Regarding the dyspnea-fatigue index, post-training patients were largely unchanged and controls deteriorated insignificantly, during the extended follow-up as well as from baseline to 10 months. Both groups decreased physical activity insignificantly during the extended follow-up; from baseline to 10 months, post-training patients tended to decrease, whereas controls significantly (P = 0.007) decreased physical activity. CONCLUSION There was no important sustained benefit 6 months after termination of an exercise training programme in heart failure patients. A small, probably clinically insignificant, sustained improvement in PGACQoL was seen in post-training patients. Controls significantly decreased their habitual physical activity over 10 months, and post-training patients showed a similar trend. Exercise training obviously has to be continued to result in sustained benefit.

OBJECTIVES The aim of this study was to analyze whether L-arginine (L-arg.) has comparable or additive effects to physical exercise regarding endothelium-dependent vasodilation in patients with chronic heart failure (CHF). BACKGROUND Endothelial dysfunction in patients with CHF can be corrected by both dietary supplementation with L-arg.
and regular physical exercise. METHODS Forty patients with severe CHF (left ventricular ejection fraction 19 ± 9%) were randomized to an L-arg. group (8 g/day), a training group (T) with daily handgrip training, L-arg. plus T (L-arg. + T), or an inactive control group (C). The mean internal radial artery diameter was determined at the beginning and after four weeks in response to brachial arterial administration of acetylcholine (ACh) (7.5, 15, 30 microg/min) and nitroglycerin (0.2 mg/min) with a transcutaneous high-resolution 10 MHz A-mode echo-tracking system coupled with a Doppler device. The power of the study to detect clinically significant differences in endothelium-dependent vasodilation was 96.6%. RESULTS At the beginning, the mean endothelium-dependent vasodilation in response to ACh 30 microg/min was 2.54 ± 0.09% (p = NS between groups). After four weeks, internal radial artery diameter increased by 8.8 ± 0.9% after ACh 30 microg/min in L-arg. (p < 0.001 vs. C), by 8.6 ± 0.9% in T (p < 0.001 vs. C), and by 12.0 ± 0.3% in L-arg. + T (p < 0.005 vs. C, L-arg., and T). Endothelium-independent vasodilation as assessed by infusion of nitroglycerin was similar in all groups at the beginning and at the end of the study. CONCLUSIONS Dietary supplementation of L-arg. as well as regular physical exercise improved agonist-mediated, endothelium-dependent vasodilation to a similar extent. Both interventions together seem to produce additive effects with respect to endothelium-dependent vasodilation.

Background Exercise training (ET) has been shown to improve functional work capacity in patients with stable chronic heart failure (CHF) having moderate symptoms (NYHA class II). This analysis was conducted to evaluate the effects of ET on left ventricular function and haemodynamics in patients with advanced CHF (NYHA class III) fulfilling the inclusion criteria of the COPERNICUS trial.
Methods Seventy-three patients with moderate and advanced CHF were prospectively randomised to a training (n = 36) or a control group (n = 37). At baseline and after six months, patients underwent echocardiography and symptom-limited ergospirometry with measurement of central haemodynamics by thermodilution. Results Nine out of 37 patients in the control group (C) and 10 out of 36 patients in the training group (T) had symptoms of advanced CHF. Exercise training over a period of six months resulted in an improvement of functional status, on average by one NYHA class, in patients with advanced CHF. Moreover, oxygen uptake at the ventilatory threshold increased by 49% (from 7.7 ± 1.0 to 11.4 ± 0.4 mL/min/kg, P < 0.01 versus baseline) and at peak exercise by 32% (from 16.3 ± 1.6 to 21.5 ± 1.2 mL/min/kg, P < 0.01 versus baseline) in training patients. The small but significant reduction in left ventricular end-diastolic diameter by 7% (from 70 ± 2 to 66 ± 2 mm; P < 0.05 versus baseline) was accompanied by an augmentation in stroke volume at rest by 32% (from 45 ± 3 to 60 ± 6 mL, P < 0.05 versus baseline) and at peak exercise by 27% (from 63 ± 9 to 81 ± 9 mL, P < 0.05 versus baseline) as a result of ET in patients with advanced CHF. Conclusion In patients with advanced CHF (NYHA class III), long-term exercise training is associated with an enhanced physical work capacity, an improvement in stroke volume, and a reduction in cardiomegaly.

Exercise training has recently become an accepted therapeutic modality in chronic heart failure after myocardial infarction. Because the therapeutic mechanism behind it is controversial and not well understood, we analyzed the influence of exercise training on blood viscosity.
Twenty-five patients with chronic heart failure (ejection fraction < 40%) after myocardial infarction were randomly assigned to either an 8-week intensive exercise program at a residential rehabilitation center or 8 weeks of sedentary life at home. Exercise consisted of two 1-hour walking sessions per day and four intensive bicycle ergometer training sessions of 40 minutes at 70% to 80% of peak exercise capacity per week. Whole blood viscosity, viscosity at a standardized hematocrit of 45% (P45) at high and low shear rates, and plasma viscosity were measured in a Couette-type viscometer before, during, and at the end of the study period. Exercise training, which significantly increased maximal cardiac output and oxygen uptake, did not change plasma viscosity, whole blood viscosity, or P45 significantly. Sedentary controls, however, had a higher whole blood viscosity and P45 after 8 weeks. No statistical difference was found, however, between the two groups. We conclude that blood rheology remains unaffected by exercise training in patients with chronic heart failure. The improvement of blood viscosity remains an interesting therapeutic option for the symptoms of these patients, which must be achieved by methods other than exercise training.

BACKGROUND Exercise capacity of patients with chronic heart failure (CHF) correlates poorly with estimates of cardiac function. Yet it has been suggested that only patients without severely impaired cardiac output (CO) benefit from exercise training. Comparisons of different training models have not been made in the same study. AIMS To evaluate whether the response to different training models diverges according to the cardiac output response to exercise in patients with chronic heart failure. METHODS Sixteen CHF patients (63 ± 11 years) with an ejection fraction of 30 ± 11% underwent a baseline cardiopulmonary exercise test, right heart catheterization, and leg muscle biopsy.
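Several of these protocols prescribe training intensity as a fraction of each patient's measured peak capacity (e.g. the 70% to 80% of peak exercise capacity above, or the 50 to 60% of peak oxygen consumption used elsewhere in this section). A minimal sketch of that prescription arithmetic, with an invented function name and an illustrative 120-W peak rather than any study value:

```python
def training_zone(peak_capacity, lower=0.70, upper=0.80):
    """Target training range as a fraction of an individually measured peak.

    Works for any unit (watts, mL/kg/min, ...), since only ratios are applied.
    """
    return peak_capacity * lower, peak_capacity * upper

# A patient whose peak cycle-ergometer workload is 120 W would train
# between roughly 84 and 96 W under a 70-80 % prescription.
print(training_zone(120))
```

The same function with `lower=0.50, upper=0.60` reproduces the moderate-intensity prescription used in the Finnish bicycle-training study.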
Cardiac output (CO) response to exercise was defined as the ratio between the CO increase and the increase in oxygen uptake (CO response index) during exercise. Patients were randomized into two training regimens differing with regard to active muscle mass, i.e. whole-body and one-legged exercise. RESULTS Baseline exercise capacity expressed as W/kg correlated with the CO response index (r = 0.51, P < 0.05). Exercise capacity on the cycle ergometer increased in both groups, but more in the one-legged than in the two-legged training group (P < 0.05). The improvement in exercise capacity did not correlate with baseline exercise capacity. It correlated with the CO response index in the one-legged (r = 0.75, P < 0.01) but not in the two-legged training group. The CO response index correlated negatively with the pulmonary capillary wedge pressure at peak exercise (r = -0.60, P < 0.05). The increase in leg muscle citrate synthase activity after training correlated negatively with the baseline CO response index (r = -0.50, P < 0.05). CONCLUSIONS The improvement of exercise capacity after one-legged training correlates with the CO increase in relation to the O2 uptake before training. In patients with a low CO response, individualization of the exercise regimen is needed, and the benefits of training a limited muscle mass at a time deserve further study.

To determine the effect of training on insulin sensitivity (IS) and how this relates to peak VO2 (peak oxygen uptake) in CHF (chronic heart failure), 77 CHF patients (New York Heart Association class II/III; men/women 59/18; age 60 ± 9 years; body mass index 26.7 ± 3.9 kg/m²; left ventricular ejection fraction 26.9 ± 8.1%; expressed as means ± S.D.) participated in the study. Patients were randomly assigned to a training or control group (TrG or CG, respectively). Sixty-one patients completed the study.
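The CO response index defined in the one-legged-training study above is a simple ratio of two exercise-induced increments. A sketch of that definition (function name and the rest/peak values are mine, chosen only to illustrate the units):

```python
def co_response_index(co_rest, co_peak, vo2_rest, vo2_peak):
    """Ratio of the cardiac output increase to the oxygen uptake increase
    from rest to peak exercise (the study's 'CO response index')."""
    return (co_peak - co_rest) / (vo2_peak - vo2_rest)

# Hypothetical patient: CO rises 4 -> 8 L/min while VO2 rises 0.3 -> 1.3 L/min,
# i.e. 4 L of extra cardiac output per extra L/min of oxygen uptake.
print(co_response_index(4.0, 8.0, 0.3, 1.3))
```

A low index flags patients whose central circulation limits exercise, which is the group for whom the authors suggest training a limited muscle mass at a time.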
Patients participated in training (combined strength and endurance exercises) four times per week, twice supervised and twice at home. Before and after the intervention, anthropometry, IS (euglycaemic hyperinsulinaemic clamp), and peak VO2 (incremental cycle ergometry) were assessed. The intervention did not affect IS significantly, even though IS increased by 20% in TrG and 11% in CG (not significant). Peak VO2 increased as a result of training (6% increase in TrG; 2% decrease in CG; P < 0.05). In both groups (TrG and CG), the change in IS correlated positively with the change in peak VO2 (r = 0.30, P < 0.05). Training resulted in an increase in peak VO2, but not in IS. Whether physical training actually increases IS in CHF patients remains unclear.

Background: Reduced heart pump function and skeletal muscle abnormalities are considered important determinants of the low physical exercise capacity in chronic heart failure. Because of reduced ventricular function, traditional physical rehabilitation may cause underperfusion and low local work intensity, thereby producing suboptimal conditions for skeletal muscle training.

OBJECTIVE To evaluate whether a specific program of moderate-intensity step aerobics training may be sufficient to improve the exercise tolerance of patients with severe chronic heart failure. PATIENTS Twenty-six patients (22 men, 4 women; mean ± SD age 54 ± 9 years) with a history of severe chronic heart failure (left ventricular ejection fraction of 18% ± 8%). STUDY DESIGN Prospective, randomized, controlled trial. Patients were randomized into exercise and control groups. All patients underwent a clinical examination and a ramp-pattern cycle exercise test before and after the observation period.
The exercise group underwent a moderate-intensity (50% of peak oxygen uptake) 12-week training program, progressing to 100 minutes per week of step aerobics and 50 minutes per week of cycling. The control group did not perform a training program. MAIN OUTCOME MEASURES Peak oxygen uptake, peak workload, percent of predicted power ability. RESULTS Significant increases in peak oxygen uptake (15 ± 3.4 to 18.5 ± 2.9 mL/kg/min; p = .001), peak workload (77 ± 26 to 99 ± 31 watts; p = .000), and percent of predicted power ability (43% ± 10% to 56% ± 13%; p = .000) were observed in the exercise group. No significant changes in baseline parameters occurred in the control group. There were no critical changes in heart rate or blood pressure in either group. CONCLUSION Moderate-intensity step aerobics training significantly increases peak oxygen uptake and peak workloads in patients with severe chronic heart failure.

BACKGROUND The effect of home-based exercise training on neurovascular control in heart failure patients is unknown. AIMS To test the hypothesis that home-based training would maintain the reduction in muscle sympathetic nerve activity (MSNA) and forearm vascular resistance (FVR) acquired after supervised training. METHODS AND RESULTS Twenty-nine patients (54 ± 1.9 years, EF < 40%) were randomised into two groups: untrained control (n = 12) and exercise trained (n = 17). Both groups underwent assessment of quality of life (QoL), MSNA, and forearm blood flow. The exercise group underwent a 4-month supervised training program followed by 4 months of home-based training. After the initial 4 months of training, patients in the exercise group showed a significant increase in peak VO2 and reduction in MSNA compared to the untrained group, but this was not maintained during 4 months of home-based training. In contrast, the decrease in FVR (56 ± 3 vs. 46 ± 4 vs.
40 ± 2 U, p = 0.008) and the improvement in QoL that were achieved during supervised training were maintained during home-based training. CONCLUSIONS Home-based training following supervised training is a safe strategy to maintain improvements in QoL and the reduction in FVR in chronic heart failure patients, but is an inadequate strategy to maintain fitness as estimated by peak VO2 or the reduction in neurohumoral activation.

AIMS Benefit from exercise training in heart failure has mainly been shown in men with ischaemic disease. We aimed to examine the effects of exercise training in heart failure patients ≤ 75 years old of both sexes and with various aetiologies. METHODS AND RESULTS Fifty-four patients with stable mild-to-moderate heart failure were randomized to exercise or control, and 49 completed the study (49% ≥ 65 years; 29% women; 24% non-ischaemic aetiology; training, n = 22; controls, n = 27). The exercise programme consisted of bicycle training at 80% of maximal intensity over a period of 4 months. Improvements vs. controls were found regarding maximal exercise capacity (6 ± 12 vs. -4 ± 12% [mean ± SD], P < 0.01) and global quality of life (2 [1] vs. 0 [1] units [median [interquartile range]], P < 0.01), but not regarding maximal oxygen consumption or the dyspnoea-fatigue index. All four of these variables significantly improved in men with ischaemic aetiology compared with controls (n = 11). However, none of these variables improved in women with ischaemic aetiology (n = 5) or in patients with non-ischaemic aetiology (n = 6). The training response was independent of age, left ventricular systolic function, and maximal oxygen consumption. No training-related adverse effects were reported. CONCLUSION Supervised exercise training was safe and beneficial in heart failure patients ≤ 75 years, especially in men with ischaemic aetiology.
The effects of exercise training in women and patients with non-ischaemic aetiology should be further examined.

OBJECTIVE Exercise training programs have been proposed as adjuncts to the treatment of heart failure. The effects of a 3-month exercise training program with 3 exercise sessions per week were assessed in patients with stable systolic chronic heart failure. METHODS We studied 24 patients with a left ventricular end-diastolic diameter of 70 ± 10 mm and a left ventricular ejection fraction of 37 ± 4%. Mean age was 52 ± 16 years. Twelve patients were assigned to an exercise training group (G1) and 12 patients to a control group (G2). Patients underwent treadmill testing, before and after exercise training, to assess distance walked, heart rate, systolic blood pressure, and double product. RESULTS In the G2 group, before and after 3 months, we observed, respectively: distance walked, 623 ± 553 and 561 ± 460 m (ns); peak heart rate, 142 ± 23 and 146 ± 33 beats/min (ns); systolic blood pressure, 154 ± 36 and 164 ± 26 mmHg (ns); and double product, 22211 ± 6454 and 24293 ± 7373 (ns). In the G1 group, before and after exercise, we observed: distance walked, 615 ± 394 and 970 ± 537 m (p < 0.003); peak heart rate, 143 ± 24 and 143 ± 29 beats/min (ns); systolic blood pressure, 136 ± 33 and 133 ± 24 mmHg (ns); and double product, 19907 ± 7323 and 19115 ± 5776, respectively. Comparing the groups, a significant difference existed regarding the variation in the double product and in distance walked. CONCLUSION Exercise training programs in patients with heart failure can bring about an improvement in physical capacity.

BACKGROUND Physical training improves exercise capacity in patients with chronic heart failure. It decreases plasma noradrenaline at rest, which may be prognostically favourable. The effect on atrial natriuretic peptide, another prognostic factor, and on catabolic and anabolic hormones remains unknown.
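The double product used as an outcome in the treadmill study above is the rate-pressure product, heart rate multiplied by systolic blood pressure, a standard index of myocardial oxygen demand. A minimal sketch (function name mine; the 142 beats/min and 154 mmHg inputs are the G2 baseline group means from the abstract):

```python
def double_product(heart_rate, systolic_bp):
    """Rate-pressure ('double') product: heart rate (beats/min) times
    systolic blood pressure (mmHg), an index of myocardial oxygen demand."""
    return heart_rate * systolic_bp

print(double_product(142, 154))  # -> 21868
```

Note that the product of the group means (21 868) need not equal the reported group mean of the individual products (22 211 ± 6454): averaging and multiplication do not commute across patients.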
Furthermore, to our knowledge, the contribution of exertional hormonal responses to the improved exercise capacity has not been evaluated. METHODS 27 patients with stable chronic heart failure (New York Heart Association class II-III) were randomized to training (n = 12) and control (n = 15) groups. The training group exercised on a bicycle ergometer for 30 min three times a week for 3 months, at a load corresponding to 50-60% of their peak oxygen consumption. For the next 3 months they exercised at home according to personal instructions. The control group did not change its physical activities. The levels of hormones regulating the cardiovascular system and metabolism were determined at rest, after graded maximal exercise, and during exercise with a constant submaximal workload. RESULTS Submaximal exercise capacity increased significantly, and peak oxygen consumption tended to improve by 12% in the training group. Plasma noradrenaline at rest tended to decrease by 19%. The plasma level of N-terminal pro-atrial natriuretic peptide did not change. Serum cortisol, a catabolic hormone, was normal at baseline and remained unchanged. The serum levels of the anabolic hormones growth hormone and insulin, as well as dehydroepiandrosterone sulfate and free testosterone, were within the normal range at baseline and were not altered by training. The dehydroepiandrosterone sulfate/cortisol and free testosterone/cortisol ratios, reflecting anabolic/catabolic balance, did not change either. Training resulted in a higher peak noradrenaline response during graded maximal exercise, and the rise in serum cortisol during exercise tended to attenuate. CONCLUSION Physical training, which improves exercise capacity, does not have an unfavourable effect on anabolic/catabolic balance or neurohumoral activation in patients with congestive heart failure. It decreases plasma noradrenaline at rest.
Minor changes in hormonal responses during exercise emerged after physical training, which are unlikely to contribute to the improved exercise capacity.

Background —In chronic heart failure (CHF), cross-talk between inflammatory activation and oxidative stress has been anticipated in skeletal muscle (SM). The role of the radical scavenger enzymes superoxide dismutase (SOD), catalase (Cat), and glutathione peroxidase (GPX), which remove oxygen radicals, has never been assessed in the SM in this context. Moreover, it remains unknown whether exercise training augments the activity of these enzymes in CHF. Methods and Results —Twenty-three patients with CHF were randomized to either 6 months of exercise training (T) or a sedentary lifestyle (C); 12 age-matched healthy subjects (HS) were studied in parallel. Activity of Cat, SOD, and GPX was assessed in SM biopsies before and after 6 months. Oxidative stress was determined by measuring nitrotyrosine formation. SOD, Cat, and GPX activity was reduced by 31%, 57%, and 51%, respectively, whereas nitrotyrosine formation was increased by 107% in SM in CHF (P < 0.05 versus HS). In CHF, exercise training augmented GPX and Cat activity in SM by 41% (P < 0.05 versus before and group C) and 42% (P < 0.05 versus before and group C), respectively, and decreased nitrotyrosine production by 35% (from 3.8 ± 0.4% tissue area before to 2.5 ± 0.3% after 6 months; P < 0.05 versus before). Conclusions —The reduced activity of major antioxidative enzymes in the SM of CHF patients is associated with increased local oxidative stress. Exercise training exerts antioxidative effects in the SM in CHF, in particular due to an augmentation in the activity of radical scavenger enzymes.

AIMS To assess whether a domiciliary programme of specific inspiratory muscle training in stable chronic heart failure results in improvements in exercise tolerance or quality of life.
METHODS AND RESULTS We conducted a randomized controlled trial of 8 weeks of inspiratory muscle training in 18 patients with stable chronic heart failure, using the Threshold trainer. Patients were randomized either to a training group inspiring for 30 min daily at 30% of maximum inspiratory mouth pressure, or to a control group of 'sham' training at 15% of maximum inspiratory mouth pressure. Sixteen of the 18 patients completed the study. Maximum inspiratory mouth pressure improved significantly in the training group compared with controls, by a mean (SD) of 25.4 (11.2) cmH2O (P = 0.04). There were, however, no significant improvements in treadmill exercise time, corridor walk test time, or quality-of-life scores in the trained group compared with controls. CONCLUSION Despite achieving a significant increase in inspiratory muscle strength, this trial of simple domiciliary inspiratory muscle training using threshold loading at 30% of maximum inspiratory mouth pressure did not result in significant improvements in exercise tolerance or quality of life in patients with chronic heart failure.

OBJECTIVE To investigate the effect of physical training (PTr) on upper leg muscle area, muscle strength, and muscle endurance, expressed as upper leg muscle function (ULMF), in relation to exercise performance in CHF. DESIGN Randomised to a training (TG) or control group (CG). SETTING Outpatient cardiac rehabilitation centre of a community hospital. PATIENTS 77 CHF patients (59 men and 18 women), NYHA class II/III, age 59.8 ± 9.3 years, LVEF 27 ± 8%. Sixteen patients dropped out during the intervention period; 61 patients (M/F: 46/15) completed the study. INTERVENTION PTr (combined strength and endurance exercises) four times per week, twice supervised and twice at home, during 26 weeks.
MAIN OUTCOME MEASURES LVEF, body composition, daily physical activity, exercise performance, upper leg muscle area and isokinetic leg muscle variables. RESULTS Workload and peak oxygen consumption decreased in the CG (-4.1% and -4%) but increased in the TG (+5% and +4%) following PTr (p<0.05, ANOVA repeated measures). Hamstrings area decreased in the CG and did not change in the TG (p<0.05, ANOVA repeated measures). ULMF improved in the TG but remained unchanged in the CG (+13.0% and 0.0%, respectively, p<0.05; ANOVA repeated measures). At baseline and after intervention, nearly 60% of the variance in maximal workload was explained by ULMF and quadriceps muscle area (multiple regression analysis). CONCLUSIONS In CHF patients, home-based training in conjunction with a supervised strength and endurance training program is safe, feasible and effective, and does not require complex training equipment. Physical training prevented loss of hamstrings muscle mass and improved exercise performance by enhancing muscle strength and endurance. Background Chronic heart failure (CHF) is associated with progressive muscle atrophy and reduced local expression of insulin-like growth factor I (IGF-I). Design The present study was designed to test the hypothesis that the local deficiency of IGF-I in the skeletal muscle of patients with CHF would respond to a 6-month aerobic training intervention. Therefore, 18 patients [mean age 52.4 (SD 4.8) years, left ventricular ejection fraction (LVEF) 27 (SD 6)%] were prospectively randomized to either 6 months of training or a sedentary lifestyle. Methods Serum levels of growth hormone (GH) were measured by immunofluorometric assay, and IGF-I by competitive solid-phase immunoassay. IGF-I expression was assessed in vastus lateralis biopsies by real-time PCR.
Results Exercise training led to a significant increase in peak oxygen uptake of 26% [from 20.3 (SD 3.3) ml/kg per min to 25.5 (SD 5.7) ml/kg per min, P=0.003 versus control]. Local expression of IGF-I increased significantly after exercise training by 81% [from 6.3 (SE 0.8) to 11.4 (SE 1.4) relative units, P=0.007 versus control], while IGF-I receptor expression was reduced by 33% [from 20.0 (SE 2.1) to 13.8 (SE 1.7) relative units, P=0.008 versus control]. Serum growth hormone (GH) rose modestly from 0.12 (SE 0.07) to 0.65 (SE 0.37) ng/ml in the training group (P=0.043 versus baseline); however, this change was not significant compared with the control group (P=0.848). IGF-I serum levels remained virtually unchanged. Conclusions Exercise training improves local IGF-I expression without significant changes in systemic parameters of the GH/IGF-I axis. These findings indicate that exercise training has the therapeutic potential to attenuate peripheral skeletal muscle alterations, in particular with respect to local IGF-I expression, in patients with moderate CHF. OBJECTIVES We sought to analyze the systemic effects of lower-limb exercise training (ET) on radial artery endothelial function in patients with chronic heart failure (CHF). BACKGROUND Local ET has the potential to improve local endothelial dysfunction in patients with CHF. However, it remains unclear whether systemic effects can be achieved by local ET. METHODS Twenty-two male patients with CHF were prospectively randomized to either ET on a bicycle ergometer (ET group, n = 11; left ventricular ejection fraction [LVEF] 26 ± 3%) or an inactive control group (group C, n = 11; LVEF 24 ± 2%). At the beginning of the study and after four weeks, endothelium-dependent and -independent vasodilation of the radial artery was determined by intra-arterial infusion of acetylcholine (ACh: 7.5, 15 and 30 µg/min) and nitroglycerin (0.2 mg/min).
The mean internal diameter (ID) of the radial artery was assessed using a high-resolution ultrasound system (NIUS-02, Asulab Research Laboratories, Neuchâtel, Switzerland) with a 10-MHz probe. RESULTS After four weeks of ET, patients showed a significant increase in the baseline-corrected mean ID in response to ACh (30 µg/min), from 33 ± 10 to 127 ± 25 µm (p < 0.001 vs. control group at four weeks). In the control group, the response to ACh (30 µg/min) remained unchanged. Endothelium-independent vasodilation was similar in both groups at the beginning of the study and at four weeks. In the training group, increases in agonist-mediated, endothelium-dependent vasodilation correlated with changes in functional work capacity (r = 0.63, p < 0.05). CONCLUSIONS In patients with stable CHF, bicycle ergometer ET leads to a correction of endothelial dysfunction of the upper extremity, indicating a systemic effect of local ET on endothelial function. The aim of this study was to investigate whether electrical stimulation of skeletal muscles could represent a rehabilitation alternative for patients with chronic heart failure (CHF). Thirty patients with CHF and NYHA class II-III were randomly assigned to a rehabilitation program using either electrical stimulation of skeletal muscles or bicycle training. Patients in the first group (n = 15) had 8 weeks of home-based low-frequency electrical stimulation (LFES) applied simultaneously to the quadriceps and calf muscles of both legs (1 h/day for 7 days/week); patients in the second group (n = 15) underwent 8 weeks of 40-minute aerobic exercise (3 times a week).
After the 8-week period, significant increases in several functional parameters were observed in both groups: maximal VO2 uptake (LFES group: from 17.5 ± 4.4 mL/kg/min to 18.3 ± 4.2 mL/kg/min, P < 0.05; bicycle group: from 18.1 ± 3.9 mL/kg/min to 19.3 ± 4.1 mL/kg/min, P < 0.01), maximal workload (LFES group: from 84.3 ± 15.2 W to 95.9 ± 9.8 W, P < 0.05; bicycle group: from 91.2 ± 13.4 W to 112.9 ± 10.8 W, P < 0.01), distance walked in 6 minutes (LFES group: from 398 ± 105 m to 435 ± 112 m, P < 0.05; bicycle group: from 425 ± 118 m to 483 ± 120 m, P < 0.03), and exercise duration (LFES group: from 488 ± 45 seconds to 568 ± 120 seconds, P < 0.05; bicycle group: from 510 ± 90 seconds to 611 ± 112 seconds, P < 0.03). These results demonstrate that an improvement in exercise capacity can be achieved either by classical exercise training or by home-based electrical stimulation. LFES should be considered a valuable alternative to classical exercise training in patients with CHF. Little is known about the nutritional status of heart failure patients and the potential synergistic effects between nutritional intake and exercise. This small, randomized trial examined the effects of a 3-month exercise program on body composition and nutritional intake in 31 men (17 exercisers; 14 controls), aged 30-76 years (mean, 56 years), with stable class II-III heart failure. Baseline and 3-month evaluations included body mass index, body fat mass by triceps skinfold thickness, dietary intake by food frequency questionnaire, and the 6-minute walk test. Exercise consisted of walking 3 d/wk and resistance exercises 2 d/wk for 40-60 minutes. Dietary recommendations were consistent with the American Heart Association/American College of Cardiology heart failure guidelines.
Exercisers decreased body weight (p=0.001), body mass index (p=0.0001), and triceps skinfold thickness (p=0.03) and improved 6-minute walk test distance (p=0.01) compared with controls. Exercisers also demonstrated trends toward decreased total caloric and cholesterol intake and a three-fold higher carbohydrate, fiber, and beta-carotene intake vs. controls. In this study population, protein, fiber, and magnesium intake were below the recommended daily allowance. After exercise, body mass index was reduced, accompanied by dietary modifications including greater intake of foods with higher moisture content. Further study is needed to investigate the interaction among diet, exercise, and weight. BACKGROUND Exercise programs for patients with heart failure have often enrolled and evaluated relatively healthy, young patients. They also have not measured the impact of exercise performance on daily activities and quality of life. METHODS AND RESULTS We investigated the impact of a 6-month supervised and graded exercise program in 33 elderly patients with moderate to severe heart failure randomized to usual care or an exercise program. Six of 17 patients did not tolerate the exercise program. Of those who did, peak oxygen consumption increased by 2.4 ± 2.8 mL/kg/min (P < .05) and 6-minute walk distance increased by 194 ft (P < .05). However, outpatient energy expenditure did not increase, as measured by either the doubly labeled water technique or the Caltrac accelerometer. Perceived quality of life also did not improve, as measured by the Medical Outcomes Study, Functional Status Assessment, or Minnesota Living With Heart Failure questionnaires. CONCLUSION Elderly patients with severe heart failure can safely exercise, with an improvement in peak exercise tolerance.
However, not all patients will benefit, and daily energy expenditure and quality of life do not improve to the same extent as peak exercise. BACKGROUND Congestive heart failure (CHF) is associated with increased peripheral vascular resistance. Exercise-induced shear stress may release endothelial relaxing factors, such as nitric oxide (NO), and inhibit the production of vasoconstrictors such as endothelin-1 (ET-1), thereby modulating vascular tone. We examined the effect of intensive training on ET-1 plasma concentrations and NO-metabolite elimination in patients with CHF after acute myocardial infarction. METHODS Seventeen patients with CHF after a myocardial infarction were randomized to an exercise group (n = 9), who performed physical training for 8 weeks, or a control group (n = 8), who received usual care. A physical examination, pulmonary function test, and a maximum exercise test were performed, and 24-hour urinary nitrate elimination and plasma ET-1 were determined before and at the end of the study period. RESULTS Maximal oxygen uptake remained unchanged in controls (17.9 ± 1.4 to 18.1 ± 1.5 mL/(kg min)) but increased in the exercise group (from 20.4 ± 0.75 to 26.7 ± 1.4 mL/(kg min)). After 8 weeks, urinary nitrate elimination in controls was significantly decreased (1.25 ± 0.20 to 1.03 ± 0.22 mmol/24 hours; P < 0.001), while it was unchanged in the exercise group (1.26 ± 0.23 to 1.39 ± 0.28; P = 0.71). Plasma ET-1 levels did not change after 8 weeks (7.87 ± 0.62 versus 7.57 ± 0.75 and 7.13 ± 0.6 versus 7.35 ± 0.7 pg/mL for the control and exercise groups, respectively). CONCLUSION In patients with CHF after acute myocardial infarction, nitrate elimination decreases over the subsequent 2 months. This trend was reversed by training.
Because nitrate elimination mirrors endogenous NO production, these results suggest that training may positively influence endothelial vasodilator function. The effect of exercise training on quality of life and exercise capacity was studied in 67 patients with mild to moderate chronic heart failure (CHF; age: 65.6±8.3 years; left ventricular ejection fraction: 26.5±9.6%). Patients were randomly allocated to either a training group or a control group. After the intervention, a significantly larger decrease in Feelings of Being Disabled (a subscale of the Heart Patients Psychological Questionnaire) and a significantly larger increase in the Self-Assessment of General Well-Being (SAGWB) were observed in the training group. Exercise time and anaerobic threshold were increased in the training group only. The increase in exercise time was related to both Feelings of Being Disabled and SAGWB. We conclude that supervised exercise training improves both quality of life and exercise capacity and can be safely performed by chronic heart failure patients. Background Chronic heart failure (CHF) is accompanied by an inflammatory activation which occurs both systemically and in the skeletal muscle. Exercise training has been shown to reduce the local expression of cytokines and inducible nitric oxide synthase (iNOS) in muscle biopsies of CHF patients. iNOS-derived NO can inhibit oxidative phosphorylation and contribute to skeletal muscle dysfunction in CHF. Design To investigate the correlation between changes in local iNOS expression associated with regular exercise and changes in aerobic enzyme activities in the skeletal muscle of patients with CHF. Twenty male CHF patients [ejection fraction 25% (SE 2), age 54 (SE 2) years] were randomized to a training group (n = 10) or a control group (C, n = 10). Methods At baseline and after 6 months, skeletal muscle iNOS expression was measured by real-time polymerase chain reaction.
iNOS protein and protein nitrosylation were assessed by immunohistochemistry. Cytochrome c oxidase (COX) activity was quantified electrochemically using the Clark oxygen electrode. Results Exercise training led to a 27% increase in cytochrome c oxidase activity [from 21.8 (SE 3.2) to 27.7 (SE 3.5) nmol O2/mg per min, P=0.02 versus baseline]. Changes in iNOS expression and iNOS protein content were inversely correlated with changes in COX activity (r = -0.60, P=0.01; r = -0.71, P<0.001). Conclusions The inverse correlation between iNOS expression/iNOS protein content and COX activity indicates that local anti-inflammatory effects may contribute to improved muscular oxidative metabolism. BACKGROUND Resistance exercise training was applied to patients with chronic heart failure (CHF) on the basis that it may partly reverse deficiencies in skeletal muscle strength and endurance, aerobic power (VO(2peak)), heart rate variability (HRV), and forearm blood flow (FBF) that are all putative factors in the syndrome. METHODS AND RESULTS Thirty-nine CHF patients (New York Heart Association Functional Class 2.3±0.5; left ventricular ejection fraction 28%±7%; age 65±11 years; 33:6 male:female) underwent 2 identical series of tests, 1 week apart, for strength and endurance of the knee and elbow extensors and flexors, VO(2peak), HRV, FBF at rest, and FBF activated by forearm exercise or limb ischemia. Patients were then randomized to 3 months of resistance training (EX, n=19), consisting of mainly isokinetic (hydraulic) ergometry interspersed with rest intervals, or continuance with usual care (CON, n=20), after which they underwent repeat endpoint testing. Combining all 4 movement patterns, strength increased for EX by 21±30% (mean±SD, P<.01) after training, whereas endurance improved 21±21% (P<.01).
Corresponding data for CON remained almost unchanged (strength P<.005, endurance P<.003, EX versus CON). VO(2peak) improved in EX by 11±15% (P<.01), whereas it decreased by 10±18% (P<.05) in CON (P<.001 EX versus CON). The ratio of low-frequency to high-frequency spectral power fell after resistance training in EX by 44±53% (P<.01) but was unchanged in CON (P<.05 EX versus CON). FBF increased at rest by 20±32% (P<.01), and when stimulated by submaximal exercise (24±32%, P<.01) or limb ischemia (26±45%, P<.01) in EX, but not in CON (P<.01 EX versus CON). CONCLUSIONS Moderate-intensity resistance exercise training in CHF patients produced favorable changes in skeletal muscle strength and endurance, VO(2peak), FBF, and HRV. Exercise is an important behavior for long-term weight control in overweight and obese patients. However, little evidence exists confirming such findings in patients with advanced heart failure (HF). Using a prospective, experimental design, the effects of 24 weeks of a low-level, home-based walking program on weight loss were studied in overweight and obese (body mass index ≥ 27 kg/m(2)) patients with advanced HF who were randomized to exercise (n = 48) and control (n = 51) groups. Weight changes between the 2 groups at baseline and 6 months were compared using repeated-measures analysis of variance. Patients were on average aged 53.3 ± 10.1 years and predominantly male (75%), Caucasian (57%), and married (55%). Most patients were in New York Heart Association class III or IV (67%), with a mean ejection fraction of 25%. Patients in the exercise group showed significant weight reduction from baseline to 6 months compared with those in the control group (-6.37 ± 11.7 vs -0.33 ± 9.3 kg, p = 0.002).
No significant differences were noted between the 2 groups in 6-minute walk distance or depression, although the changes were in the anticipated direction. Modest weight losses of >5% were associated with cardiopulmonary exercise test-documented workload levels at 6 months (r = 0.331, p = 0.006), as well as decreased depression (r = -0.315, p = 0.01) and hostility (r = -0.355, p = 0.005). The number of hospital admissions was significantly smaller for patients in the exercise group compared with those in the control group (0.63 ± 0.94 vs 1.07 ± 0.95, p < 0.05). In conclusion, the findings demonstrate the beneficial effects of a low-level, home-based walking program on weight loss in overweight and obese patients with advanced HF. BACKGROUND Despite reported benefits of exercising for chronic heart failure patients, limited data are available on quality of life and the effects of different modes of training. This study assessed the effects of local endurance training with knee extensor muscles on exercise tolerance and health-related quality of life in male patients with moderate, chronic heart failure. METHODS AND RESULTS Twenty-one patients (mean age, 60 years; range, 43-73 years) in New York Heart Association functional classes II-III (ejection fraction, 28 ± 11%) were randomized to two training groups and one control group. Both training groups performed the same relative quantity of dynamic work with knee extensor muscles 3 days a week for 8 weeks. However, the quantity of muscle mass trained at one time and, consequently, the load on the integrated circulation differed between the groups (two- and one-leg training). Exercise capacity and perceived quality of life were assessed before and after the training or control period. Exercise tolerance increased (P < .01) in both training groups, with significantly (P < .01) better improvement in submaximal exercise capacity in the two-leg group.
There was no improvement in the control group. Coping capacity did not differ from the reference range and did not change during the study. Global health-related quality of life was depressed at baseline. Training improved (P < .05) health-related quality of life. Compared with the control group, the improvement in health-related quality of life subscales was more pronounced in the two-leg training group (P < .02-.005) than in the one-leg training group (not significant to P < .05). CONCLUSIONS Local muscle endurance training has beneficial effects on exercise tolerance and health-related quality of life in patients with moderate, chronic heart failure. As two-leg training showed a tendency toward better improvement in submaximal exercise capacity and in quality of life than one-leg training, the effects on quality of life appear to be exercise-related in addition to a possible placebo-related effect. Also, the effect appears to be related to the extent of muscle trained at one time. OBJECTIVES The aim of this study was to evaluate the effects of high-intensity exercise training on left ventricular function and hemodynamic responses to exercise in patients with reduced ventricular function. BACKGROUND Results of studies on central hemodynamic adaptations to exercise training in patients with chronic heart failure have been contradictory, and some research has suggested that training causes further myocardial damage in these patients after a myocardial infarction. METHODS Twenty-five men with left ventricular dysfunction after a myocardial infarction or coronary artery bypass graft surgery were randomized to an exercise training group (mean age ± SD 56 ± 5 years, mean ejection fraction [EF] 32 ± 7%, n = 12) or a control group (mean age 55 ± 7 years, mean EF 33 ± 6%, n = 13).
Patients in the exercise group performed 2 h of walking daily and four weekly sessions of high-intensity monitored stationary cycling (40 min at 70% to 80% of peak capacity) at a residential rehabilitation center for a period of 2 months. Ventilatory gas exchange and upright hemodynamic measurements (rest and peak exercise cardiac output; pulmonary artery, wedge and mean arterial pressures; and systemic vascular resistance) were performed before and after the study period. RESULTS Maximal oxygen uptake (VO2max) increased by 23% after 1 month of training, and by an additional 6% after month 2. The increase in VO2max in the trained group paralleled an increase in maximal cardiac output (12.0 ± 1.8 liters/min before training vs. 13.7 ± 2.5 liters/min after training, p < 0.05), but maximal cardiac output did not change in the control group. Neither stroke volume nor hemodynamic pressures at rest or during exercise differed within or between groups. Resting left ventricular mass, volumes and EF determined by magnetic resonance imaging were unchanged in both groups. CONCLUSIONS High-intensity exercise training in patients with reduced left ventricular function results in substantial increases in VO2max by way of an increase in maximal cardiac output combined with a widening of the maximal arteriovenous oxygen difference, but no changes in contractility. Training did not worsen hemodynamic status or cause further myocardial damage. BACKGROUND Beneficial training outcomes have been reported in sedentary patients with chronic heart failure (CHF) after exercise training. However, data on training effects in previously trained patients, as well as comparisons of different exercise modes, are lacking. The aim of this study is to compare exercise training on a cycle ergometer (major muscle mass) and aerobic knee-extensor training (minor muscle mass) in previously trained patients with CHF.
METHODS AND RESULTS Twenty-four men and women (age, 63 ± 10 years [mean ± SD]) with stable, moderate CHF (left ventricular ejection fraction, 30% ± 11%) who had completed their first exercise training period more than 1 year earlier were allocated to either the exercise or control group. After stratification for sex, age, ejection fraction, and cardiac output response, the training group was further randomized to either cycle ergometer or knee-extensor training for 8 weeks. The control and training patients did not differ at baseline, and the measured variables did not change in the control group during the 8 weeks. Citrate synthase activity in skeletal muscle increased after cycle training (23%; P < .02) and knee-extensor training (45%; P < .008), and blood lactate concentration at submaximal intensities decreased (P < .04) in both groups. However, only after knee-extensor training did peak oxygen uptake increase (19%; P < .01) and sympathetic nervous system activity, measured as plasma norepinephrine concentration at rest (P < .05) and during exercise (P < .008), decrease. Minnesota Living with Heart Failure questionnaire scores also showed improvement in health-related quality of life (P < .05) only after knee-extensor training. CONCLUSION Physical training is beneficial in previously trained patients with CHF. Aerobic training involving a minor muscle mass shows greater efficiency than training involving a major muscle mass. BACKGROUND Hospital-based exercise programs using a bicycle ergometer or a combination of exercise modalities have shown positive benefits in heart failure, but may not be readily accessible to many patients. Thus, we sought to evaluate the effects of a 12-week home walking exercise program on functional status and symptoms in patients with heart failure.
METHODS A randomized controlled trial comparing a 12-week progressive home walking exercise program (n = 42) to a "usual activity" control group (n = 37) was conducted in patients with heart failure (78 [99%] male; mean age 62.6 ± 10.6 years; ejection fraction 27% ± 8.8%; 63 [80%] New York Heart Association class II; 15 [20%] New York Heart Association class III-IV) from a Veterans Affairs medical center and a university-affiliated medical center. Functional status (peak oxygen consumption via cardiopulmonary exercise testing, 6-minute walk test, the Heart Failure Functional Status Inventory) and symptoms (Dyspnea-Fatigue Index score with a postglobal rating of symptoms) were measured at baseline and 12 weeks. RESULTS No adverse events related to exercise training occurred. Overall mean compliance with training was 74 ± 37%. Peak oxygen consumption and the Heart Failure Functional Status Inventory were unchanged with training. Compared with the usual activity group, the training group had significantly longer walking distances measured by the 6-minute walk test (1264 ± 255 vs 1337 ± 272 feet, P = .001) and improved postglobal rating of symptoms (P = .03). CONCLUSION In patients with heart failure, a progressive home walking exercise program is acceptable, increases walking distance, and decreases global rating of symptoms. OBJECTIVES The present study was designed to evaluate the effects of an ambulatory training program in patients with chronic heart failure (CHF) on the ultrastructural morphology of mitochondria and fiber type distribution of skeletal muscle and its relation to peripheral perfusion. BACKGROUND Recent studies in patients with CHF have suggested that intrinsic abnormalities in skeletal muscle can contribute to the development of early lactic acidosis and fatigue during exercise.
METHODS Patients were prospectively randomized to either a training group (n = 9; mean [±SD] left ventricular ejection fraction [LVEF] 26 ± 10) participating in an ambulatory training program or to a physically inactive control group (n = 9; LVEF 28 ± 10%). At baseline and after 6 months, patients underwent symptom-limited bicycle exercise testing with measurement of central and peripheral hemodynamic variables, as well as percutaneous needle biopsies of the vastus lateralis muscle. The mitochondrial ultrastructure of skeletal muscle was analyzed by ultrastructural morphometry; cytochrome c oxidase activity was visualized by histochemistry and subsequently quantitated by morphometry. The fiber type distribution was determined by adenosine triphosphatase staining. RESULTS After 6 months of exercise training there was a significant increase of 41% in the surface density of cytochrome c oxidase-positive mitochondria (SVMOcox+) (p < 0.05 vs. control) and of 43% in the surface density of mitochondrial cristae (SVMC) (p < 0.05 vs. control). Furthermore, exercise training induced a 92% increase in the surface density of the mitochondrial inner border membrane (p < 0.05 vs. control). In contrast, the total number of cytochrome c oxidase-positive mitochondria remained essentially unchanged. Exercise-induced improvement in peak oxygen uptake was closely linked to changes in SVMOcox+ (p < 0.01, r = 0.66). After exercise training, changes in submaximal femoral venous lactate levels were not related to changes in submaximal leg blood flow (r = -0.4) but were inversely related to changes in the volume density of mitochondria (p = 0.01; r = -0.6) as well as to changes in SVMC (p < 0.05; r = -0.5). After exercise training there was a "reshift" from type II to type I fibers (p < 0.05 vs. control).
CONCLUSIONS Patients with CHF who engage in regular physical exercise show enhanced oxidative enzyme activity in the working skeletal muscle and a concomitant reshift to type I fibers. These exercise-induced changes in oxidative capacity appear to be unrelated to changes in peripheral perfusion. CONTEXT Exercise training in patients with chronic heart failure improves work capacity by enhancing endothelial function and skeletal muscle aerobic metabolism, but effects on central hemodynamic function are not well established. OBJECTIVE To evaluate the effects of exercise training on left ventricular (LV) function and hemodynamic response to exercise in patients with stable chronic heart failure. DESIGN Prospective randomized trial conducted in 1994-1999. SETTING University department of cardiology/outpatient clinic in Germany. PATIENTS Consecutive sample of 73 men aged 70 years or younger with chronic heart failure (with LV ejection fraction of approximately 0.27). INTERVENTION Patients were randomly assigned to 2 weeks of in-hospital ergometer exercise for 10 minutes 4 to 6 times per day, followed by 6 months of home-based ergometer exercise training for 20 minutes per day at 70% of peak oxygen uptake (n=36), or to no intervention (control group; n=37). MAIN OUTCOME MEASURES Ergospirometry with measurement of central hemodynamics by thermodilution at rest and during exercise; echocardiographic determination of LV diameters and volumes, at baseline and 6-month follow-up, for the exercise training vs control groups. RESULTS After 6 months, patients in the exercise training group had statistically significant improvements compared with controls in New York Heart Association functional class, maximal ventilation, exercise time, and exercise capacity, as well as decreased resting heart rate and increased stroke volume at rest.
In the exercise training group, an increase from baseline to 6-month follow-up was observed in mean (SD) resting LV ejection fraction (0.30 [0.08] vs 0.35 [0.09]; P=.003). Mean (SD) total peripheral resistance (TPR) during peak exercise was reduced by 157 (306) dyne/s/cm(-5) in the exercise training group vs an increase of 43 (148) dyne/s/cm(-5) in the control group (P=.003), with a concomitant increase in mean (SD) stroke volume of 14 (22) mL vs 1 (19) mL in the control group (P=.03). There was a small but significant reduction in mean (SD) LV end-diastolic diameter of 4 (6) mm vs an increase of 1 (4) mm in the control group (P<.001). Changes from baseline in resting TPR for both groups were correlated with changes in stroke volume (r=-0.76; P<.001) and in LV end-diastolic diameter (r=0.45; P<.001). CONCLUSIONS In patients with stable chronic heart failure, exercise training is associated with reduction of peripheral resistance and results in small but significant improvements in stroke volume and reduction in cardiomegaly. JAMA. 2000. BACKGROUND Among the factors that contribute to limiting exercise tolerance in chronic heart failure are reduced peripheral blood flow and impaired vasodilatory capacity. Exercise training improves vasodilatory capacity in normal subjects, but controlled studies of exercise training evaluating upper and lower limb blood flow rates have not been performed in patients with reduced ventricular function. Improved vasodilatory capacity could help explain how training increases exercise capacity in these patients. METHODS Twenty patients (mean age 55 ± 6 years) with reduced left ventricular function (mean ejection fraction 32% ± 6%) after a myocardial infarction were randomized to a 2-month high-intensity residential rehabilitation program or to a control group and were monitored over the subsequent year.
Both groups were treated according to current practice with angiotensin-converting enzyme inhibition therapy. Training began 1 month after myocardial infarction. Baseline and postischemic flow rates were measured by plethysmography in both the upper and lower limbs 1 month, 3 months, and 1 year after the infarction. Peak oxygen uptake (VO2) and cardiac output were measured before and after training, and peak VO2 was determined again after 1 year. RESULTS After 2 months of training, peak VO2 increased 25%, VO2 at the lactate threshold increased 40%, and maximal cardiac output increased from 12.1 +/- 1.6 L/min to 13.9 +/- 2.4 L/min in the exercise group (all p < 0.05), whereas no differences were observed in the control group. At the 1-year follow-up no further increases in peak VO2 were noted in either group, but the higher value persisted in the trained group. However, changes in limb flow rates were poorly related to changes in both peak VO2 and maximal cardiac output. Improvements in baseline and postischemic flow rates occurred mainly in the lower limbs and were observed in the two groups to a similar degree. CONCLUSION Exercise training is highly effective in improving exercise capacity in patients with reduced ventricular function after myocardial infarction. These improvements parallel an increase in maximal cardiac output, but they are unrelated to vasodilatory capacity. In patients with reduced ventricular function after myocardial infarction, lower limb vasodilatory capacity improves gradually over the subsequent year, and these improvements occur irrespective of exercise training. BACKGROUND Chronic heart failure (CHF) is characterized by endothelial dysfunction. Vascular endothelium is important for control of haemostasis and vasoregulation. The aim of the present study was to investigate plasma levels of several endothelial markers and the exercise-induced changes on these plasma levels in CHF patients.
Subsequently, the effect of a 6-month training programme on these markers is described. MATERIALS AND METHODS Twenty-nine male CHF patients (NYHA II/III, age 60 +/- 8 years, body mass index 26.7 +/- 2.3 kg m(-2), left ventricular ejection fraction 26.3 +/- 7.2%; mean +/- SD) participated. Patients were randomly assigned to a training or control group. Training (26 weeks; combined strength and endurance exercises) was four sessions/week: two sessions supervised and two sessions at home. Before and after intervention, anthropometry, endothelial markers (haemostasis and vasoregulation), maximal workload and peak oxygen uptake were assessed. RESULTS Physical training positively affected maximal workload. Plasma levels of endothelial markers were not affected by physical training and not related to exercise tolerance. After training, stimulated (maximal exercise) plasma von Willebrand Factor (vWF) release was present, whereas at baseline this release was absent. CONCLUSION Physical training led to normalization of the stimulated plasma vWF release. Plasma levels of other endothelial markers were not affected by physical training either at rest or under stimulated (maximal exercise) conditions. OBJECTIVES We sought to evaluate the effect of physical training on neurohormonal activation in patients with heart failure (HF). BACKGROUND Patients with HF benefit from physical training. Chronic neurohormonal activation has detrimental effects on ventricular remodeling and prognosis of patients with HF. METHODS A total of 95 patients with HF were randomly assigned into two groups: 47 patients (group T) underwent a nine-month training program at 60% of the maximal oxygen uptake (VO2), whereas 48 patients did not (group C). The exercise load was adjusted during follow-up to achieve a progressive training effect.
Plasma assay of B-type natriuretic peptide (BNP), amino-terminal pro-brain natriuretic peptide (NT-proBNP), norepinephrine, plasma renin activity, and aldosterone; quality-of-life questionnaire; echocardiogram; and cardiopulmonary stress test were performed upon enrollment and at the third and ninth month. RESULTS A total of 85 patients completed the protocol (44 in group T, left ventricular ejection fraction [EF] 35 +/- 2%, mean +/- SEM; and 41 in group C, EF 32 +/- 2%, p = NS). At the ninth month, patients who underwent training showed an improvement in workload (+14%, p < 0.001), peak VO2 (+13%, p < 0.001), systolic function (EF +9%, p < 0.01), and quality of life. We noted that BNP, NT-proBNP, and norepinephrine values decreased after training (-34%, p < 0.01; -32%, p < 0.05; -26%, p < 0.01, respectively). The increase in peak VO2 with training correlated significantly with the decrease in both BNP and NT-proBNP levels (p < 0.001 and p < 0.01, respectively). Patients who did not undergo training showed no changes. CONCLUSIONS Clinical benefits after physical training in patients with HF are associated with blunting of adrenergic overactivity and of natriuretic peptide overexpression. BACKGROUND The effect that supervised or unsupervised exercise training has on aerobic capacity (peak oxygen consumption [VO2peak]), muscle strength and quality of life in older women with heart failure remains unknown. OBJECTIVE To examine the effect of six months (three months supervised followed by three months unsupervised) of aerobic training (AT) or combined aerobic and strength training (CAST) on VO2peak, muscle strength and quality of life in older women with heart failure. METHODS Twenty older women (mean age +/- SD, 72 +/- 8 years) with clinically stable heart failure were randomly assigned to AT (n=10) or CAST (n=10).
Supervised AT was performed two days per week at 60% to 70% heart rate reserve, whereas unsupervised training was performed two days per week at a rating of perceived exertion of 12 to 14 on the Borg scale. The CAST group also performed one to two sets of low-to-moderate intensity strength training two days per week. RESULTS Supervised AT or CAST resulted in an increase in VO2peak (12%; P<0.05) and leg press strength (13%; P<0.05) that returned to baseline after unsupervised training. Vertical row strength was greater (+23%; P<0.05) after supervised CAST and remained unchanged after supervised or unsupervised AT. Supervised or unsupervised exercise training was not associated with a significant change in quality of life. CONCLUSIONS Supervised AT or CAST are effective modes of exercise to improve VO2peak and muscle strength in older women with heart failure. However, the improvements in VO2peak and muscle strength are not maintained with unsupervised exercise training. BACKGROUND The purpose of this study was to examine the effects of exercise training on functional capacity in patients with heart failure. METHODS One hundred eighty-one patients in New York Heart Association class I to III, with ejection fraction < 40% and 6-minute walk distance < 500 meters, were recruited into a randomized, controlled, single-blind trial comparing 3 months of supervised training, then 9 months of home-based training, with usual care. RESULTS There was a significant increase in 6-minute walk distance at 3 and 12 months but no between-group differences. Incremental peak oxygen uptake increased in the exercise group compared with the control group at 3 months (0.104 +/- 0.026 L/min vs 0.025 +/- 0.023 L/min; P = .026) and 12 months (0.154 +/- 0.074 L/min vs 0.024 +/- 0.027 L/min; P = .081). Compared with the control group, significant increases were observed in the exercise group for arm and leg strength.
No significant changes were observed in cardiac function or quality of life. Adherence to exercise was good during supervised training but reduced during home-based training. CONCLUSIONS Exercise training improves peak oxygen uptake and strength during supervised training. Over the final 9 months of the study, there was little further improvement, suggesting that some supervision is required for these patients. There were no adverse effects on cardiac function or clinical events. BACKGROUND Beneficial training outcomes have been reported in patients with chronic heart failure (CHF) following leg exercise training. However, data from more comprehensive training programs are limited. The aim of this study was to test the hypothesis that exercise training applying the concept of comprehensive local muscle training can improve aerobic and functional working capacity as well as quality of life in patients with CHF. METHODS Twenty-four men and women [age 63 +/- 9 years (mean +/- S.D.)] with stable, moderate chronic heart failure (left ventricular ejection fraction 30 +/- 10%) were investigated in a randomized controlled study with a training group of 16 patients and a control group of 8 patients. The training was performed as aerobic resistance training by activating all the main muscle groups, one at a time. The patients exercised for 1 h, three times per week for 8 weeks. RESULTS Patient groups did not differ at baseline. Peak oxygen uptake (8%, P<0.03), the distance walked in a 6-min walking test (11%, P<0.002), health-related quality of life (P<0.001), and plasma norepinephrine levels at rest (32%, P<0.003) and at submaximal intensities (P<0.03) improved after training. No changes were found in the control group, except for decreased peak oxygen uptake (P<0.02) and quality of life scores (P<0.03).
CONCLUSIONS Since comprehensive physical training activating a minor muscle mass at a time markedly improves exercise capacity and quality of life and reduces catecholamine levels, it can be recommended for the rehabilitation of patients with CHF under supervision of a physical therapist. The present study investigates whether lower-limb dominant exercise training in patients with chronic heart failure (CHF) improves endothelial function primarily in the trained lower extremities or equally in the upper and lower extremities. Twenty-eight patients with CHF were randomized to the exercise or control group. The exercise group underwent cycle ergometer training for 3 months while controls continued an inactive sedentary lifestyle. Exercise capacity (6-min walk test) and flow-mediated vasodilation in the brachial and posterior tibial arteries were evaluated. After 3 months, walking performance increased only in the exercise group (488 +/- 16 to 501 +/- 14 m [control]; 497 +/- 23 to 567 +/- 39 m [exercise, p<0.05]). The flow-mediated vasodilation in the brachial arteries did not change in either group (4.2 +/- 0.5 to 4.5 +/- 0.4% [control]; 4.3 +/- 0.5 to 4.6 +/- 0.4% [exercise]), but that in the posterior tibial arteries increased only in the exercise group (4.1 +/- 0.5 to 4.1 +/- 0.3% [control]; 3.6 +/- 0.3 to 6.4 +/- 0.6% [exercise, p<0.01]). Cycle ergometer training improved flow-mediated vasodilation in the trained lower limbs, but not in the untrained upper limbs. Exercise training appears to correct endothelial dysfunction predominantly by a local effect in the trained extremities. Background Physical training currently constitutes an important part of treatment of heart failure patients. So far, no data are available on the effects of regular exercise in elderly (aged > 65 years) heart failure patients.
Methods In a prospective trial, patients with chronic heart failure (New York Heart Association class II and III) were randomly assigned to a training group and a control group. Patients in the training group performed additional exercises three times a week, while patients in the control group continued regular treatment. To analyse the influence of age, both groups were subdivided into subjects younger than and older than 65 years. The effect of training on exercise parameters was evaluated by means of a treadmill test. Quality of life aspects were evaluated with the help of the Heart Patients Psychological Questionnaire and a single-question Self Awareness of General Well-Being test. Results Comparison of changes between groups revealed that training increased the duration of the exercise test and improved aspects of quality of life in the trained patients aged both younger than and older than 65 years. Conclusion Exercise training is equally effective in patients aged younger than and older than 65 years. The aim of this study was to evaluate the impact of a three-month exercise program on the perception of quality of life in patients with severe chronic heart failure. In a randomized controlled setting, 27 patients with a left ventricular ejection fraction of 18.1 +/- 8.0% were entered into the study. The training group performed aerobic exercises for three hours/week while the control group continued their usual activities of daily living. Quality of life was measured using the German version of the MOS SF-36. Two patients required a change in their drug regimen and were therefore withdrawn from the study. Twenty-five patients completed the study. In the exercise group the perception of quality of life improved significantly in the domains of vitality (p = 0.0001), physical role fulfillment (p = 0.001), and physical (p = 0.02) and social (p = 0.0002) functioning.
Exercise was effective in increasing peak oxygen uptake and exercise time (p < 0.01). Only weak correlations were registered between parameters of physical performance and quality of life domains. The results of the study indicate that aerobic exercise can improve the perception of quality of life in patients with severe chronic heart failure. PURPOSE The purpose of this study was to determine whether subjects with chronic heart failure, who completed a 12-week rehabilitation program, would have significantly greater quality of life, better aerobic fitness, less difficulty with symptoms of heart failure, greater self-efficacy for exercise, and higher daily activity levels when compared with subjects in a control group. METHODS Thirty-one males, aged 64 +/- 10 years with left ventricular ejection fraction of 29 +/- 7%, were randomized to a moderate intensity supervised aerobic exercise program (n = 15) or a control group (n = 16). Twenty-seven subjects completed at least 1 follow-up assessment. RESULTS After 12 weeks there were significant differences in the change scores for perceived physical function (using the RAND Corporation's 36-item short form) (P = .025) and peak oxygen uptake (P = .019) between the exercise and control groups, with the exercise group experiencing improved physical function and fitness. CONCLUSIONS Exercise training in adults with heart failure increases exercise tolerance and perceived physical function. Improved heart failure symptoms, self-efficacy for exercise, or increased physical activity may not be associated with enhancement of exercise tolerance. The aim of the study was to evaluate, in a controlled setting, the effects of a 5-month dynamic peripheral training programme in patients with clinical signs of congestive heart failure, with special reference to their anaerobic threshold, muscle function, heart rate variability and quality of life.
Twenty-four randomized patients with clinical signs of heart failure in NYHA II-III entered the study. Training resulted in a significant (p = 0.01) change in the anaerobic threshold, the patients' ability to lift weights (p = 0.01) and performance of heel-lift (p = 0.01). The heart rate recorded during the training exercises decreased significantly (p = 0.04). There were no significant differences in peak oxygen uptake, isokinetic and isometric strength, HRV and quality of life, except for three items in the control group. The results of this study indicate that peripheral training is beneficial for patients with clinical signs of congestive heart failure. Patients with heart failure (HF) often have profound activity limitations and diminished quality of life (QOL) due to symptoms of dyspnea and fatigue. Although recent studies demonstrate positive physiologic and psychological benefits of low to moderate intensity, supervised, aerobic exercise training performed 3 to 5 days/week for 20 to 40 minutes' duration in a monitored setting, the efficacy of a home-based exercise program combining endurance and resistance exercise on symptoms and QOL is unknown. This randomized controlled study examined the efficacy, safety, and adherence rates of a 3-month home-based combined walking and resistance exercise program on symptoms and QOL in 40 women and men aged 30 to 76 years with New York Heart Association class II to III HF. Baseline and 3-month evaluations consisted of a chronic HF questionnaire to assess symptoms and QOL, and exercise capacity by symptom-limited treadmill exercise test with respiratory gas analysis. The exercise intervention improved fatigue (p = 0.02), emotional function (p = 0.01), and mastery (p = 0.04). Overall exercise adherence was excellent (90%) and there were no reported adverse events.
A moderate intensity home-based combined walking and resistance program for patients with class II to III HF is safe and effective in reducing symptoms and improving QOL. The role of exercise training in the treatment of heart failure remains a paradox, because exercise intolerance is a characteristic finding in patients with this condition. Improved exercise performance after exercise training has been seen in patients with heart failure, but such training is not yet widely incorporated into clinical practice. In addition, the mechanisms by which this improvement occurs are uncertain. Factors that may explain the improvement include an increase in cardiac output [1, 2], an improvement in skeletal muscle metabolism [3-5], and an increase in peak blood flow to the exercising limb that is caused by a decrease in vascular resistance [1, 2, 5]. Five randomized trials [5-9] have assessed exercise performance after exercise training in patients with symptomatic heart failure. Two trials that assessed cardiorespiratory fitness as measured by oxygen consumption (VO2) showed a 22% improvement in peak VO2 after 4 weeks of exercise training [6] and a 31% improvement after 6 months of exercise training [5]. Two other trials [7, 8] showed an increase in exercise duration after 12 weeks of exercise training, and a fifth trial [9] recently found no significant increase in peak VO2 after exercise training. We sought to assess the benefit of exercise training in patients with heart failure caused by left ventricular systolic dysfunction and to further describe the physiologic changes associated with exercise training. Functional capacity and cardiorespiratory fitness were measured before and after exercise training in patients with compensated heart failure who were receiving standard medical therapy.
Patients were randomly assigned either to a group that participated in a program of three supervised exercise sessions per week for 24 weeks (exercise group) or to a control group that did not exercise. Methods Patients Forty men with compensated heart failure and left ventricular dysfunction were randomly assigned to the exercise group (n = 21) or the control group (n = 19). Randomization was done according to a computer-generated randomization list. Patients were recruited from the outpatient heart failure clinic of a tertiary care hospital or from the office of a cardiologist practicing in the community. Patients meeting the eligibility criteria for the study first received a brief explanation of the study from their cardiologist. Patients willing to participate in the randomized trial were then referred. The hospital's institutional review board approved our study, and all patients provided written informed consent. Inclusion criteria were New York Heart Association class II or III, a resting ejection fraction of 35% or less as measured by echocardiography or gated equilibrium radionuclide angiography, and no change in medical therapy for 30 days before randomization. Exclusion criteria were atrial fibrillation, acute myocardial infarction within the previous 3 months, angina pectoris at rest or induced by exercise, current enrollment in another clinical trial, and current participation in a regular exercise program (at least twice weekly). Patients were classified as having ischemic cardiomyopathy if they had had a myocardial infarction or had angiographic evidence of coronary artery disease that could explain the extent of ventricular dysfunction. If they did not meet these criteria, they were classified as having idiopathic dilated cardiomyopathy. Study Design Exercise tests were completed before and after a 30-day prestudy period.
The prestudy period was used to document the stability of exercise tolerance, clinical conditions, and prescribed medications. Immediately after completing the second exercise test, patients were randomly assigned to the exercise group or the control group. Each patient's assignment was sealed in an envelope until completion of the second exercise test. Exercise testing was repeated at weeks 12 and 24 for patients in both groups. Each patient's physician was asked not to change a patient's drug regimen during the study, if possible. Patients in the control group were instructed to maintain their normal daily activity habits and not to begin an exercise regimen. Controls were contacted by telephone every 2 to 3 weeks to assess compliance, cardiac-related symptoms, and continued avoidance of a regular activity program. Exercise Testing Symptom-limited, maximal exercise tests were completed using an upright stationary cycle ergometer (Monark, Stockholm, Sweden), starting at a power output of 25 W and increasing by 25 W every 3 minutes. Tests were discontinued when dyspnea or calf, thigh, or generalized fatigue developed. Patients were monitored by electrocardiography (Q-3000, Quinton Instruments, Seattle, Washington) at rest, during exercise, and during 8 minutes of recovery. Blood pressure, heart rate, rating of perceived exertion (categorized on the Borg scale; range, 6 to 20), and a 12-lead electrocardiogram were obtained after 30 minutes of supine rest, within the last 25 seconds of each stage of exercise, and during peak exercise. Air expired during exercise testing was analyzed using a Horizon II Metabolic System (Sensormedics, Yorba Linda, California). Direct measurement and calculations were used to determine peak VO2, carbon dioxide production (VCO2), ventilation, oxygen (O2) pulse, and respiratory exchange ratio.
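The stepwise incremental protocol just described (start at 25 W, add 25 W every 3 minutes until symptom-limited) maps elapsed test time to prescribed workload. A minimal sketch of that mapping; the function name and example times are illustrative, not from the study:

```python
def workload_watts(elapsed_min: float, start_w: float = 25.0,
                   step_w: float = 25.0, stage_min: float = 3.0) -> float:
    """Workload for a stepwise incremental cycle protocol.

    Each completed stage of `stage_min` minutes adds `step_w` watts
    to the starting power output `start_w`.
    """
    if elapsed_min < 0:
        raise ValueError("elapsed time must be non-negative")
    completed_stages = int(elapsed_min // stage_min)
    return start_w + step_w * completed_stages

print(workload_watts(0))  # -> 25.0 (first stage)
print(workload_watts(3))  # -> 50.0 (second stage begins at minute 3)
print(workload_watts(7))  # -> 75.0 (third stage: minutes 6-9)
```

The termination criterion (dyspnea or leg/generalized fatigue) is a clinical judgment and is deliberately not modeled here.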
Expired air was sampled at a rate of 10 per second and reported as a 15-second average. Exercise tests in the exercise group and the control group represented peak effort, as evidenced by ratings of perceived exertion that were generally 16 or more and respiratory exchange ratios that were greater than 1.1. Ventilatory-derived anaerobic threshold (V-AT) was determined using the V-slope method, originally defined by Beaver and colleagues [10] and later simplified by others [11-13]. This measure, in which VCO2 is plotted as a function of VO2, can be used to detect the beginning of excess carbon dioxide production caused by the buffering of H+ that arises from lactic acid. Two independent, experienced reviewers blinded to the patients' group assignment and testing periods determined V-AT. For four patients in the exercise group and five patients in the control group, a reviewer could not determine V-AT in at least one of each patient's three exercise tests. We also computed the slope of the relation between ventilation and VCO2 as a marker of the severity of heart failure [14]. Exercise Training Each exercise training session lasted 43 minutes. During each session, patients completed a 5-minute slow warm-up phase, a 33-minute aerobic phase (three different types of exercise equipment were used for 11 minutes each), and a 5-minute cool-down phase. Exercise equipment included motor-driven treadmills, stationary cycles, rowing machines, and arm ergometers. Patients attended the exercise training program three times per week. Using the heart rate reserve method [15], we set exercise intensity at 60% for the first 2 weeks and then increased it, as tolerated, to as high as 80%. A rating of perceived exertion of 12 to 14 was also used to guide exercise intensity. Heart rate and rhythm were monitored during exercise using a single-lead electrocardiography telemetry system.
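The heart rate reserve (Karvonen) method used to set training intensity prescribes a target heart rate as the resting rate plus a chosen fraction of the reserve (peak minus resting). A minimal sketch; the resting and peak heart rates in the example are hypothetical, only the 60% and 80% intensities come from the protocol above:

```python
def target_heart_rate(hr_rest: float, hr_peak: float, intensity: float) -> float:
    """Karvonen (heart rate reserve) method:
    target = resting HR + intensity * (peak HR - resting HR)."""
    if not 0.0 <= intensity <= 1.0:
        raise ValueError("intensity must be a fraction between 0 and 1")
    return hr_rest + intensity * (hr_peak - hr_rest)

# Hypothetical patient: resting HR 70 bpm, peak HR 150 bpm.
print(target_heart_rate(70, 150, 0.60))  # -> 118.0 (initial 60% intensity)
print(target_heart_rate(70, 150, 0.80))  # -> 134.0 (upper 80% limit)
```

Because the reserve is anchored to each patient's own resting and peak rates, the same percentage yields individualized targets, which is why the method suits a heterogeneous heart failure cohort.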
Statistical Analysis Of the 40 patients entered into the study, only those who also completed the exercise tests at weeks 12 and 24 were considered in the data analysis. The 5 patients who dropped out for nonmedical reasons were asked to return for follow-up testing, but they refused. We compared patient characteristics at baseline using an unpaired t-test or the Fisher exact test. We used univariate repeated-measures analysis of variance with the Greenhouse-Geisser [16] sphericity correction to determine whether a significant (P < 0.05) difference in the change across time occurred between the two groups. For variables for which a significant (or a tendency toward a significant) time-group interaction was detected, we used analysis of variance to assess a within-group time effect and used a Student two-sample t-test to assess a group effect. For the latter two analyses, we used the Bonferroni multiple testing adjustment to reduce the alpha level accordingly. Values are expressed as means +/- SE. The SAS software package (SAS Institute, Cary, North Carolina) was used for all analyses. Results Compliance, Medical Therapy, and Safety Among patients who completed the study, no differences in demographic characteristics were seen between the two study groups after randomization (Table 1). Of the 40 patients randomly assigned at baseline, 29 completed the study and 11 dropped out (Table 1). Regardless of the reason for dropout, patients with ischemic cardiomyopathy tended to drop out more frequently (7 of 16 patients [44%]) than did patients with dilated cardiomyopathy (4 of 24 patients [17%]) (P = 0.08). In addition, ejection fraction tended to be lower in patients who dropped out than in those who did not (18% +/- 5% compared with 23% +/- 8%; P = 0.09), and patients who dropped out tended to be older (61 +/- 10 years compared with 54 +/- 11 years; P = 0.07).
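The Bonferroni multiple testing adjustment mentioned in the statistical analysis simply tests each of the follow-up comparisons against the overall significance level divided by the number of comparisons. A minimal sketch; the two-comparison example mirrors the within-group and between-group tests described, but the specific numbers are illustrative:

```python
def bonferroni_adjusted_alpha(alpha: float, n_tests: int) -> float:
    """Bonferroni correction: each of n_tests comparisons is judged
    against alpha / n_tests, bounding the family-wise error rate at alpha."""
    if n_tests < 1:
        raise ValueError("n_tests must be at least 1")
    return alpha / n_tests

# With the conventional alpha of 0.05 and two follow-up comparisons
# (within-group time effect and between-group effect):
print(bonferroni_adjusted_alpha(0.05, 2))  # -> 0.025
```

The correction is conservative (it ignores correlation between tests), which is acceptable here given the small number of follow-up comparisons.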
No differences were seen in New York Heart Association class (II compared with III) between patients who completed the study and those who dropped out. Table 1. Baseline Patient Characteristics * Fifteen patients in the exercise group completed the study. Two patients dropped out because of noncardiac medical conditions (progressive, limiting arthritis in one patient and newly diagnosed cancer in the other) that developed within 1 month of the start of the exercise program. One patient developed atrial fibrillation between week 12 and week 24; 3 other patients stopped exercising for personal reasons before week 12 and refused follow-up testing. Fourteen of the 19 patients in the control group completed the study. Two dropped out for personal reasons and refused follow-up testing, 1 developed atrial fibrillation between week 12 and week 24, 1 was BACKGROUND Supervised cardiac rehabilitation programs have been offered to patients following myocardial infarction (MI), coronary artery bypass graft surgery (CABG) and percutaneous coronary intervention (PCI) for many years. However, limited information is available on the usefulness of rehabilitation programs in chronic heart failure (CHF). The aim of our study was to evaluate the outcome of supervised physical training on CHF patients by measuring both central and peripheral factors. METHODS This was a prospective randomized study, including 43 patients with CHF, New York Heart Association (NYHA) class II or III, mean age 68 years. After initial measurements of VO2 peak, 6 min walk distance, muscle strength, plasma levels of atrial natriuretic peptide (ANP) and brain natriuretic peptide (BNP), echocardiogram, measurements of pulmonary function and a quality of life questionnaire, patients were randomized to either a training group (n = 21) or a control group (n = 22).
The training group had a supervised aerobic and resistance training program twice a week for five months. After the training program was completed, all measurements were repeated in both groups. RESULTS No training-related adverse events were reported. Significant improvement was found between groups in the six minute walk test (+37.1 m vs. +5.3 m, p = 0.01), work load on the bicycle exercise test (+6.1 W vs. +2.1 W, p = 0.03), time on the bicycle exercise test (+41 s vs. +0 s, p = 0.02) and quadriceps muscle strength test (+2.8 kg vs. +0.2 kg, p = 0.003). Quality of life factors that reflect exercise tolerance and general health improved significantly in the training group compared to the control group. No other significant changes were found between the two groups. CONCLUSION Supervised physical training as used in this study appears safe for CHF patients in NYHA class II or III. The improvement in functional capacity observed in the training group seems to be related to peripheral factors rather than to central cardiovascular performance. Quittan M, Wiesinger GF, Sturm B, Puig S, Mayr W, Sochor A, Paternostro T, Resch KL, Pacher R, Fialka-Moser V: Improvement of thigh muscles by neuromuscular electrical stimulation in patients with refractory heart failure: a single-blind, randomized, controlled trial. Am J Phys Med Rehabil 2001;80:206-214. Objective To determine the impact of an 8-wk neuromuscular stimulation program of thigh muscles on strength and cross-sectional area in patients with refractory heart failure listed for transplantation. Design Forty-two patients with a stable disease course were assigned randomly to a stimulation group (SG) or a control group (CG). The stimulation protocol consisted of biphasic symmetric impulses with a frequency of 50 Hz and an on/off regime of 2/6 sec.
Results Primary outcome measures were isometric and isokinetic thigh muscle strength and muscle cross-sectional area. Our results showed an increase of muscle strength by a mean of 22.7 for knee extensor and by 35.4 for knee flexor muscles. The CG remained unchanged or decreased by -8.4 in extensor strength. Cross-sectional area increased in the SG by 15.5 and in the CG by 1.7. Conclusions Activities of daily living as well as quality of life increased in the SG but not in the CG. Subscales of the SF-36 increased significantly in the SG, especially concerning physical functioning by +7.5 (1.3-30.0), emotional role by +33.3 (0-66.6), and social functioning by +18.8 (0-46.9), all P < 0.05. Either no change or a decrease was observed in the CG. Neuromuscular electrical stimulation of thigh muscles in patients with refractory heart failure is effective in increasing muscle strength and bulk and positively affects the perception of quality of life and activities of daily living. Eighty-two patients aged > or = 70 years with heart failure were randomized to a gentle, seated exercise program or to usual care. Six-minute walk distance and quality of life did not change between groups, but daily activity as measured by accelerometry increased in the exercise group relative to the control group. OBJECTIVES The purpose of this study was to demonstrate in patients with moderate to severe heart failure that exertional dyspnea can be alleviated by improving muscle function. BACKGROUND Dyspnea is a frequent limiting symptom in patients with chronic heart failure (CHF). This sensation may originate from activation of receptors in the musculature rather than the lung. METHODS To investigate whether dyspnea could be alleviated by selective changes in leg muscle function, we performed isolated lower-limb training in 17 patients with severe CHF. Eight patients learned guided imagery relaxation techniques and served as an active control group.
Exercise training consisted of three months of low-level bicycle and treadmill exercise such that minute ventilation was <25 l/min. Leg calisthenics were also performed. Maximal and submaximal exercise performance, respiratory and quadriceps muscle strength and endurance, and quality-of-life and dyspnea scales were measured before and after each intervention. Metabolic stress testing (VO2), pulmonary function tests and isokinetic strength testing were also performed. RESULTS In the active control group, no changes in leg muscle function, pulmonary function, maximal and submaximal exercise performance or quality-of-life questionnaires were observed. In the training group, peak torque of leg flexors (pre: 39 ± 15 ft-lb; post: 50 ± 13 ft-lb; p < 0.002) increased and the fatigue ratio decreased, indicating improved strength and endurance of the leg muscles. Maximal inspiratory and expiratory mouth pressures and maximum voluntary ventilation were unchanged. Peak VO2 increased (pre: 12 ± 2.2 ml/kg/min; post: 14 ± 2.6 ml/kg/min), and the duration of exercise at 70% peak VO2 increased (pre: 11.5 ± 3.1 min; post: 21.5 ± 5.4 min; p < 0.003). Perceived dyspnea during the submaximal testing was decreased. Minnesota Living with Heart Failure Score, Guyatt Dyspnea Scale, and the Transitional Dyspnea Index were all improved with training (all p < 0.05). CONCLUSIONS We concluded that improvement of limb muscle function alleviates dyspnea and improves exercise performance in patients with CHF. BACKGROUND Due to dyspnea and fatigue, patients with chronic heart failure (CHF) are often restricted in the performance of everyday activities, which gradually may lead to hypoactivity. AIMS To assess whether aerobic training leads to a more active lifestyle and improved quality of life (QoL) in patients with CHF.
METHODS Patients with stable CHF (NYHA II/III; 59 (11) years) were randomly assigned to a training group (n=18; 3-month aerobic program above standard treatment) or control group (n=16; standard treatment without special advice for exercise). Measurements were performed on level of everyday physical activity (PA, novel accelerometry-based activity monitor) and QoL, and on several related parameters. RESULTS Training did not result in a more active lifestyle or improved QoL, but improved (P<0.05) peak power (17%), 6-min walk distance (10%), muscle strength (13–15%) and depression (−1.3 unit). Changes in level of everyday PA were related to changes in peak VO2 (r=0.58, P=0.01) and knee extension strength (r=0.48, P=0.05). CONCLUSIONS At group level, training did not result in a more active lifestyle or improved QoL. However, correlations between training-related changes in parameters suggest that aerobic training has the potential to increase levels of everyday PA in CHF. BACKGROUND It is still a matter of debate whether exercise training (ET) is a beneficial treatment in chronic heart failure (CHF). METHODS AND RESULTS To determine whether long-term moderate ET improves functional capacity and quality of life in patients with CHF and whether these effects translate into a favorable outcome, 110 patients with stable CHF were initially recruited, and 99 (59 ± 14 years of age; 88 men and 11 women) were randomized into 2 groups. One group (group T, n=50) underwent ET at 60% of peak VO2, initially 3 times a week for 8 weeks, then twice a week for 1 year. Another group (group NT, n=49) did not exercise. At baseline and at months 2 and 14, all patients underwent a cardiopulmonary exercise test, while 74 patients (37 in group T and 37 in group NT) with ischemic heart disease underwent myocardial scintigraphy. Quality of life was assessed by questionnaire.
Ninety-four patients completed the protocol (48 in group T and 46 in group NT). Changes were observed only in patients in group T. Both peak VO2 and thallium activity score improved at 2 months (18% and 24%, respectively; P<0.001 for both) and did not change further after 1 year. Quality of life also improved and paralleled peak VO2. Exercise training was associated both with lower mortality (n=9 versus n=20 for those with training versus those without; relative risk (RR)=0.37; 95% CI, 0.17 to 0.84; P=0.01) and hospital readmission for heart failure (5 versus 14; RR=0.29; 95% CI, 0.11 to 0.88; P=0.02). Independent predictors of events were ventilatory threshold at baseline (beta-coefficient=0.378) and posttraining thallium activity score (beta-coefficient=−0.165). CONCLUSIONS Long-term moderate ET determines a sustained improvement in functional capacity and quality of life in patients with CHF. This benefit seems to translate into a favorable outcome. OBJECTIVES The goal of this study was to test the hypothesis that exercise training reduces resting sympathetic neural activation in patients with chronic advanced heart failure. BACKGROUND Exercise training in heart failure has been shown to be beneficial, but its mechanisms of benefit remain unknown. METHODS Sixteen New York Heart Association class II to III heart failure patients, age 35 to 60 years, ejection fraction ≤40%, were divided into two groups: 1) exercise-trained (n=7), and 2) sedentary control (n=9). A normal control exercise-trained group was also studied (n=8). The four-month supervised exercise training program consisted of three 60-min exercise sessions per week, at heart rate levels that corresponded up to 10% below the respiratory compensation point. Muscle sympathetic nerve activity (MSNA) was recorded directly from the peroneal nerve using the technique of microneurography.
Forearm blood flow was measured by venous plethysmography. RESULTS Baseline MSNA was greater in heart failure patients compared with normal controls; MSNA was uniformly decreased after exercise training in heart failure patients (60 ± 3 vs. 38 ± 3 bursts/100 heartbeats), and the mean difference in the change was significantly (p < 0.05) greater than the mean difference in the change in sedentary heart failure or trained normal controls. In fact, resting MSNA in trained heart failure patients was no longer significantly greater than in trained normal controls. In heart failure patients, peak VO2 and forearm blood flow, but not left ventricular ejection fraction, increased after training. CONCLUSIONS These findings demonstrate that exercise training in heart failure patients results in dramatic reductions in directly recorded resting sympathetic nerve activity. In fact, MSNA was no longer greater than in trained, healthy controls. The purpose of this pilot study was to test the adjunctive effects of a 12-week exercise training intervention vs. standard pharmacologic therapy on quality of life, functional status, and mood in heart failure patients. A randomized, two-group repeated-measures design was used to test outcomes at baseline and 12 weeks in 23 subjects (ejection fraction ≤40%, standard pharmacologic therapy [diuretics, angiotensin-converting enzyme inhibitors, β-blockers, and digoxin] and no change in medical therapy for 30 days). The exercise group had significantly higher adjusted means on the role physical, role emotional, and mental functioning subscales of the Medical Outcomes Study 36-item Short-Form Health Survey compared with the control group. Confusion/bewilderment (Profile of Mood States subscale) adjusted mean scores were significantly lower for the exercise group, indicating better mood compared with the control group.
Exercise training provided adjunctive benefit in terms of role and mental functioning for these heart failure patients.
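The mortality result quoted above for the long-term moderate exercise-training trial (9 deaths among 48 trained patients vs 20 among 46 untrained; published RR = 0.37, 95% CI 0.17 to 0.84) can be sanity-checked from the raw counts. A minimal sketch in Python, assuming a simple Wald confidence interval on the log relative risk; the published estimate was presumably adjusted, so this crude calculation does not reproduce it exactly:

```python
import math

def relative_risk(events_a, n_a, events_b, n_b, z=1.96):
    """Unadjusted relative risk with a Wald 95% CI on the log scale."""
    rr = (events_a / n_a) / (events_b / n_b)
    se = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Deaths in the long-term exercise-training trial: 9/48 trained vs 20/46 untrained.
rr, lo, hi = relative_risk(9, 48, 20, 46)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

The unadjusted estimate works out to roughly RR ≈ 0.43 (95% CI ≈ 0.22 to 0.85), in the same direction as the published adjusted value.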
11,036
29,329,598
Comparing ramucirumab and bevacizumab in combination with traditional chemotherapy, ramucirumab has been shown to improve progression-free survival and overall survival. The tyrosine kinase inhibitor apatinib combined with traditional chemotherapy has been shown to improve overall response rate and progression-free survival, with marginal improvements in overall survival. Chemotherapy in combination with anti-VEGF drugs in the management of advanced gastric cancer significantly improves overall response rate, progression-free survival and overall survival when compared to chemotherapy alone.
Objectives Advanced gastric cancer poses a therapeutic challenge worldwide. In randomised clinical trials, anti-VEGF has been reported as an essential agent for the treatment of advanced gastric cancer. This review aims to assess the treatment outcome of anti-angiogenesis therapy through the VEGF pathway in the management of patients with advanced gastric cancer.
BACKGROUND VEGFR-2 has a role in gastric cancer pathogenesis and progression. We assessed whether ramucirumab, a monoclonal antibody VEGFR-2 antagonist, in combination with paclitaxel would increase overall survival in patients previously treated for advanced gastric cancer compared with placebo plus paclitaxel. METHODS This randomised, placebo-controlled, double-blind, phase 3 trial was done at 170 centres in 27 countries in North and South America, Europe, Asia, and Australia. Patients aged 18 years or older with advanced gastric or gastro-oesophageal junction adenocarcinoma and disease progression on or within 4 months after first-line chemotherapy (platinum plus fluoropyrimidine with or without an anthracycline) were randomly assigned with a centralised interactive voice or web-response system in a 1:1 ratio to receive ramucirumab 8 mg/kg or placebo intravenously on days 1 and 15, plus paclitaxel 80 mg/m(2) intravenously on days 1, 8, and 15 of a 28-day cycle. A permuted block randomisation, stratified by geographic region, time to progression on first-line therapy, and disease measurability, was used. The primary endpoint was overall survival. Efficacy analysis was by intention to treat, and safety analysis included all patients who received at least one treatment with study drug. This trial is registered with ClinicalTrials.gov, number NCT01170663, and has been completed; patients who are still receiving treatment are in the extension phase. FINDINGS Between Dec 23, 2010, and Sept 23, 2012, 665 patients were randomly assigned to treatment: 330 to ramucirumab plus paclitaxel and 335 to placebo plus paclitaxel. Overall survival was significantly longer in the ramucirumab plus paclitaxel group than in the placebo plus paclitaxel group (median 9·6 months [95% CI 8·5–10·8] vs 7·4 months [95% CI 6·3–8·4], hazard ratio 0·807 [95% CI 0·678–0·962]; p=0·017).
Grade 3 or higher adverse events that occurred in more than 5% of patients in the ramucirumab plus paclitaxel group versus placebo plus paclitaxel included neutropenia (133 [41%] of 327 vs 62 [19%] of 329), leucopenia (57 [17%] vs 22 [7%]), hypertension (46 [14%] vs eight [2%]), fatigue (39 [12%] vs 18 [5%]), anaemia (30 [9%] vs 34 [10%]), and abdominal pain (20 [6%] vs 11 [3%]). The incidence of grade 3 or higher febrile neutropenia was low in both groups (ten [3%] vs eight [2%]). INTERPRETATION The combination of ramucirumab with paclitaxel significantly increases overall survival compared with placebo plus paclitaxel, and could be regarded as a new standard second-line treatment for patients with advanced gastric cancer. FUNDING Eli Lilly and Company. Summary Purpose. This phase II, open-label, multicenter study assessed the oral, multitargeted tyrosine kinase inhibitor sunitinib in patients with advanced gastric or gastroesophageal junction adenocarcinoma who had received prior chemotherapy. Experimental design. Patients received sunitinib 50 mg/day on Schedule 4/2 (4 weeks on treatment, followed by 2 weeks off treatment). The primary endpoint was objective response rate; secondary endpoints included clinical benefit rate, duration of response, progression-free survival (PFS), overall survival (OS), pharmacokinetics, pharmacodynamics, safety and tolerability, and quality of life. Results. Of 78 patients enrolled, most had gastric adenocarcinoma (93.6%) and metastatic disease (93.6%). All were evaluable for safety and efficacy. Two patients (2.6%) had partial responses and 25 patients (32.1%) had a best response of stable disease for ≥6 weeks. Median PFS was 2.3 months (95% confidence interval [CI], 1.6–2.6 months) and median OS was 6.8 months (95% CI, 4.4–9.6 months).
Grade ≥3 thrombocytopenia and neutropenia were reported in 34.6% and 29.4% of patients, respectively, and the most common non-hematologic adverse events were fatigue, anorexia, nausea, diarrhea, and stomatitis. Pharmacokinetics of sunitinib and its active metabolite were consistent with previous reports. There were no marked associations between baseline soluble protein levels, or changes from baseline, and measures of clinical outcome. Conclusions. The progression-delaying effect and manageable toxicity observed with sunitinib in this study suggest that although single-agent sunitinib has insufficient clinical value as second-line treatment for advanced gastric cancer, its role in combination with chemotherapy merits further study. Background YN968D1 (Apatinib) selectively inhibits phosphorylation of VEGFR-2 and tumor angiogenesis in a mouse model. The study was conducted to determine the maximum tolerated dose (MTD), safety profile, pharmacokinetic variables, and antitumor activity in advanced solid malignancies. Methods This dose-escalation study was conducted according to the Chinese State Food and Drug Administration (SFDA) recommendations in patients with advanced solid tumors to determine the MTD for orally administered apatinib. Doses of continuously administered apatinib were escalated from 250 mg. Treatment continued after the dose-escalation phase until withdrawal of consent, intolerable toxicities, disease progression or death. Results Forty-six patients were enrolled. Hypertension and hand-foot syndrome were the two dose-limiting toxicities noted at a dose level of 1000 mg. MTD was determined to be 850 mg once daily. Pharmacokinetic analysis showed early absorption with a half-life of 9 hours. The mean half-life was constant over all dose groups. Steady-state conditions analysis suggested no accumulation during 56 days of once-daily administration.
The most frequently observed drug-related adverse events were hypertension (69.5%; 29 grade 1–2 and 3 grade 3–4), proteinuria (47.8%; 16 grade 1–2 and 6 grade 3–4), and hand-foot syndrome (45.6%; 15 grade 1–2 and 6 grade 3–4). Among the thirty-seven evaluable patients, PR was noted in seven patients (18.9%) and SD in 24 (64.9%), with a disease control rate of 83.8% at 8 weeks. Conclusions The recommended dose of 750 mg once daily was well tolerated. Encouraging antitumor activity across a broad range of malignancies warrants further evaluation in selected populations. Trial registration ClinicalTrials.gov unique identifier: PURPOSE The first planned interim analysis (median follow-up, 3 years) of the Adjuvant Chemotherapy Trial of S-1 for Gastric Cancer confirmed that the oral fluoropyrimidine derivative S-1 significantly improved overall survival, the primary end point. The results were therefore opened at the recommendation of an independent data and safety monitoring committee. We report 5-year follow-up data on patients enrolled onto the ACTS-GC study. PATIENTS AND METHODS Patients with histologically confirmed stage II or III gastric cancer who underwent gastrectomy with D2 lymphadenectomy were randomly assigned to receive S-1 after surgery or surgery only. S-1 (80 to 120 mg per day) was given for 4 weeks, followed by 2 weeks of rest. This 6-week cycle was repeated for 1 year. The primary end point was overall survival, and the secondary end points were relapse-free survival and safety. RESULTS The overall survival rate at 5 years was 71.7% in the S-1 group and 61.1% in the surgery-only group (hazard ratio [HR], 0.669; 95% CI, 0.540 to 0.828). The relapse-free survival rate at 5 years was 65.4% in the S-1 group and 53.1% in the surgery-only group (HR, 0.653; 95% CI, 0.537 to 0.793).
Subgroup analyses according to principal demographic factors such as sex, age, disease stage, and histologic type showed no interaction between treatment and any characteristic. CONCLUSION On the basis of 5-year follow-up data, postoperative adjuvant therapy with S-1 was confirmed to improve overall survival and relapse-free survival in patients with stage II or III gastric cancer who had undergone D2 gastrectomy. Angiogenesis is an important process in cell development, especially in cancer. Vascular endothelial growth factor (VEGF) signaling is an important regulator of angiogenesis. Several therapies that act against VEGF signal transduction have been developed, including YN968D1, which is a potent inhibitor of the VEGF signaling pathway. This study investigated the antitumor activity of YN968D1 (apatinib mesylate) in vitro and in vivo. YN968D1 potently suppressed the kinase activities of VEGFR-2, c-kit and c-src, and inhibited cellular phosphorylation of VEGFR-2, c-kit and PDGFRβ. YN968D1 effectively inhibited proliferation, migration and tube formation of human umbilical vein endothelial cells induced by FBS, and blocked the budding of rat aortic ring. In vivo, YN968D1 alone and in combination with chemotherapeutic agents effectively inhibited the growth of several established human tumor xenograft models with little toxicity. A phase I study of YN968D1 has shown encouraging antitumor activity and a manageable toxicity profile. These findings suggest that YN968D1 has promise as an antitumor drug and might have clinical benefits. (Cancer Sci 2011;102:1374–1380.) BACKGROUND A regimen of epirubicin, cisplatin, and infused fluorouracil (ECF) improves survival among patients with incurable locally advanced or metastatic gastric adenocarcinoma. We assessed whether the addition of a perioperative regimen of ECF to surgery improves outcomes among patients with potentially curable gastric cancer.
METHODS We randomly assigned patients with resectable adenocarcinoma of the stomach, esophagogastric junction, or lower esophagus to either perioperative chemotherapy and surgery (250 patients) or surgery alone (253 patients). Chemotherapy consisted of three preoperative and three postoperative cycles of intravenous epirubicin (50 mg per square meter of body-surface area) and cisplatin (60 mg per square meter) on day 1, and a continuous intravenous infusion of fluorouracil (200 mg per square meter per day) for 21 days. The primary end point was overall survival. RESULTS ECF-related adverse effects were similar to those previously reported among patients with advanced gastric cancer. Rates of postoperative complications were similar in the perioperative-chemotherapy group and the surgery group (46 percent and 45 percent, respectively), as were the numbers of deaths within 30 days after surgery. The resected tumors were significantly smaller and less advanced in the perioperative-chemotherapy group. With a median follow-up of four years, 149 patients in the perioperative-chemotherapy group and 170 in the surgery group had died. As compared with the surgery group, the perioperative-chemotherapy group had a higher likelihood of overall survival (hazard ratio for death, 0.75; 95 percent confidence interval, 0.60 to 0.93; P=0.009; five-year survival rate, 36 percent vs. 23 percent) and of progression-free survival (hazard ratio for progression, 0.66; 95 percent confidence interval, 0.53 to 0.81; P<0.001). CONCLUSIONS In patients with operable gastric or lower esophageal adenocarcinomas, a perioperative regimen of ECF decreased tumor size and stage and significantly improved progression-free and overall survival. (Current Controlled Trials number, ISRCTN93793971 [controlled-trials.com].)
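The ACTS-GC survival figures reported above (5-year overall survival 71.7% with adjuvant S-1 vs 61.1% with surgery only) imply an absolute benefit that can be computed directly. A minimal sketch, assuming the usual definitions of absolute risk reduction (here a survival-rate difference) and number needed to treat; these derived figures are illustrative and are not reported in the trial itself:

```python
import math

def arr_nnt(rate_treated, rate_control):
    """Absolute risk reduction (difference in survival rates) and the
    corresponding number needed to treat, rounded up to a whole patient."""
    arr = rate_treated - rate_control
    nnt = math.ceil(1 / arr)
    return arr, nnt

# ACTS-GC 5-year overall survival: 0.717 with S-1 vs 0.611 with surgery alone.
arr, nnt = arr_nnt(0.717, 0.611)
print(round(arr, 3), nnt)
```

On these numbers, roughly ten patients would need adjuvant S-1 for one additional 5-year survivor.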
11,037
24,026,259
Hypoglycemic risk was similar to that of other agents. Results for cardiovascular outcomes and death were inconclusive. An imbalance in the incidence of bladder and breast cancer was noted with dapagliflozin compared with control. Sodium-glucose cotransporter 2 inhibitors may improve short-term outcomes in adults with type 2 diabetes, but effects on long-term outcomes and safety are unclear.
BACKGROUND Sodium-glucose cotransporter 2 (SGLT2) inhibitors are a new class of antidiabetic drugs. PURPOSE To assess the efficacy and safety of SGLT2 inhibitors in adults with type 2 diabetes.
Abbreviations: ACCORD, Action to Control Cardiovascular Risk in Diabetes; ADVANCE, Action in Diabetes and Vascular Disease: Preterax and Diamicron Modified-Release Controlled Evaluation; AGI, α-glucosidase inhibitor; CAD, coronary artery disease; CKD, chronic kidney disease; CVD, cardiovascular disease; DPP-4, dipeptidyl peptidase IV; GIP, glucose-dependent insulinotropic peptide; GLP-1, glucagon-like peptide 1; NPH, neutral protamine Hagedorn; TZD, thiazolidinedione; UKPDS, UK Prospective Diabetes Study; VADT, Veterans Affairs Diabetes. OBJECTIVE To evaluate the effects of canagliflozin, a sodium-glucose cotransporter 2 inhibitor, in type 2 diabetes mellitus inadequately controlled with metformin monotherapy. RESEARCH DESIGN AND METHODS This was a double-blind, placebo-controlled, parallel-group, multicenter, dose-ranging study in 451 subjects randomized to canagliflozin 50, 100, 200, or 300 mg once daily (QD) or 300 mg twice daily (BID), sitagliptin 100 mg QD, or placebo. The primary end point was change in A1C from baseline through week 12. Secondary end points included change in fasting plasma glucose (FPG), body weight, and overnight urinary glucose-to-creatinine ratio. Safety and tolerability were also assessed. RESULTS Canagliflozin was associated with significant reductions in A1C from baseline (7.6–8.0%) to week 12: −0.79, −0.76, −0.70, −0.92, and −0.95% for canagliflozin 50, 100, 200, 300 mg QD and 300 mg BID, respectively, versus −0.22% for placebo (all P < 0.001) and −0.74% for sitagliptin. FPG was reduced by −16 to −27 mg/dL, and body weight was reduced by −2.3 to −3.4%, with significant increases in urinary glucose-to-creatinine ratio. Adverse events were transient, mild to moderate, and balanced across arms except for a non-dose-dependent increase in symptomatic genital infections with canagliflozin (3–8%) versus placebo and sitagliptin (2%).
Urinary tract infections were reported without dose dependency in 3–9% of canagliflozin, 6% of placebo, and 2% of sitagliptin arms. Overall incidence of hypoglycemia was low. CONCLUSIONS Canagliflozin added onto metformin significantly improved glycemic control in type 2 diabetes and was associated with a low incidence of hypoglycemia and significant weight loss. The safety/tolerability profile of canagliflozin was favorable except for an increased frequency of genital infections in females. CONTEXT Dapagliflozin, a selective sodium-glucose cotransporter 2 (SGLT2) inhibitor, reduces hyperglycemia in patients with type 2 diabetes mellitus (T2DM) by increasing urinary glucose excretion, and weight loss is a consistent associated finding. OBJECTIVES Our objectives were to confirm weight loss with dapagliflozin and establish through body composition measurements whether weight loss is accounted for by changes in fat or fluid components. DESIGN AND SETTING This was a 24-wk, international, multicenter, randomized, parallel-group, double-blind, placebo-controlled study with an ongoing 78-wk site- and patient-blinded extension period at 40 sites in five countries. PATIENTS Included were 182 patients with T2DM (mean values: women 63.3 and men 58.6 yr of age; hemoglobin A1c 7.17%, body mass index 31.9 kg/m2, and body weight 91.5 kg) inadequately controlled on metformin. INTERVENTION Dapagliflozin 10 mg/d or placebo was added to open-label metformin for 24 wk. MAIN OUTCOME MEASURES The primary endpoint was total body weight (TBW) change from baseline at wk 24. Key secondary endpoints were waist circumference and dual-energy x-ray absorptiometry total-body fat mass (FM) changes from baseline at wk 24, and the proportion of patients achieving a body weight reduction of at least 5% at wk 24.
In a subset of patients, magnetic resonance assessment of visceral adipose tissue (VAT) and subcutaneous adipose tissue (SAT) volume and hepatic lipid content were also evaluated. RESULTS At wk 24, placebo-corrected changes with dapagliflozin were as follows: TBW, −2.08 kg (95% confidence interval (CI) = −2.84 to −1.31; P<0.0001); waist circumference, −1.52 cm (95% CI = −2.74 to −0.31; P=0.0143); FM, −1.48 kg (95% CI = −2.22 to −0.74; P=0.0001); proportion of patients achieving weight reduction of at least 5%, +26.2% (95% CI = 15.5 to 36.7; P<0.0001); VAT, −258.4 cm3 (95% CI = −448.1 to −68.6; nominal P=0.0084); SAT, −184.9 cm3 (95% CI = −359.7 to −10.1; nominal P=0.0385). In the dapagliflozin vs. placebo groups, respectively, serious adverse events were reported in 6.6 vs. 1.1%; events suggestive of vulvovaginitis, balanitis, and related genital infection in 3.3 vs. 0%; and lower urinary tract infections in 6.6 vs. 2.2%. CONCLUSIONS Dapagliflozin reduces TBW, predominantly by reducing FM, VAT and SAT, in T2DM inadequately controlled with metformin. OBJECTIVE Dapagliflozin, a highly selective inhibitor of the renal sodium-glucose cotransporter-2, increases urinary excretion of glucose and lowers plasma glucose levels in an insulin-independent manner. We evaluated the efficacy and safety of dapagliflozin in treatment-naive patients with type 2 diabetes. RESEARCH DESIGN AND METHODS This was a 24-week parallel-group, double-blind, placebo-controlled phase 3 trial. Patients with A1C 7.0–10% (n = 485) were randomly assigned to one of seven arms to receive once-daily placebo or 2.5, 5, or 10 mg dapagliflozin once daily in the morning (main cohort) or evening (exploratory cohort). Patients with A1C 10.1–12% (high-A1C exploratory cohort; n = 73) were randomly assigned 1:1 to receive blinded treatment with a morning dose of 5 or 10 mg/day dapagliflozin.
The primary end point was change from baseline in A1C in the main cohort, statistically tested using an ANCOVA. RESULTS In the main cohort, mean A1C changes from baseline at week 24 were −0.23% with placebo and −0.58, −0.77 (P = 0.0005 vs. placebo), and −0.89% (P < 0.0001 vs. placebo) with 2.5, 5, and 10 mg dapagliflozin, respectively. Signs, symptoms, and other reports suggestive of urinary tract infections and genital infection were more frequently noted in the dapagliflozin arms. There were no major episodes of hypoglycemia. Data from exploratory cohorts were consistent with these results. CONCLUSIONS Dapagliflozin lowered hyperglycemia in treatment-naive patients with newly diagnosed type 2 diabetes. The near absence of hypoglycemia and an insulin-independent mechanism of action make dapagliflozin a unique addition to existing treatment options for type 2 diabetes. Overwhelming evidence shows the quality of reporting of randomised controlled trials (RCTs) is not optimal. Without transparent reporting, readers cannot judge the reliability and validity of trial findings nor extract information for systematic reviews. Recent methodological analyses indicate that inadequate reporting and design are associated with biased estimates of treatment effects. Such systematic error is seriously damaging to RCTs, which are considered the gold standard for evaluating interventions because of their ability to minimise or avoid bias. A group of scientists and editors developed the CONSORT (Consolidated Standards of Reporting Trials) statement to improve the quality of reporting of RCTs. It was first published in 1996 and updated in 2001. The statement consists of a checklist and flow diagram that authors can use for reporting an RCT. Many leading medical journals and major international editorial groups have endorsed the CONSORT statement. The statement facilitates critical appraisal and interpretation of RCTs.
During the 2001 CONSORT revision, it became clear that explanation and elaboration of the principles underlying the CONSORT statement would help investigators and others to write or appraise trial reports. A CONSORT explanation and elaboration article was published in 2001 alongside the 2001 version of the CONSORT statement. After an expert meeting in January 2007, the CONSORT statement has been further revised and is published as the CONSORT 2010 Statement. This update improves the wording and clarity of the previous checklist and incorporates recommendations related to topics that have only recently received recognition, such as selective outcome reporting bias. This explanatory and elaboration document, intended to enhance the use, understanding, and dissemination of the CONSORT statement, has also been extensively revised. It presents the meaning and rationale for each new and updated checklist item, providing examples of good reporting and, where possible, references to relevant empirical studies. Several examples of flow diagrams are included. The CONSORT 2010 Statement, this revised explanatory and elaboration document, and the associated website (www.consort-statement.org) should be helpful resources to improve reporting of randomised trials. OBJECTIVE To examine the safety and efficacy of dapagliflozin, a sodium-glucose cotransporter-2 inhibitor, added on to pioglitazone in type 2 diabetes inadequately controlled on pioglitazone. RESEARCH DESIGN AND METHODS Treatment-naive patients or those receiving metformin, sulfonylurea, or thiazolidinedione entered a 10-week pioglitazone dose-optimization period with only pioglitazone. They were then randomized, along with patients previously receiving pioglitazone ≥30 mg, to 48 weeks of double-blind dapagliflozin 5 (n = 141) or 10 mg (n = 140) or placebo (n = 139) every day plus open-label pioglitazone.
The primary objective compared HbA1c change from baseline with dapagliflozin plus pioglitazone versus placebo plus pioglitazone at week 24. The primary analysis was based on an ANCOVA model using last observation carried forward; all remaining analyses used repeated-measures analysis. RESULTS At week 24, the mean reduction from baseline in HbA1c was −0.42% for placebo versus −0.82 and −0.97% for the dapagliflozin 5 and 10 mg groups, respectively (P = 0.0007 and P < 0.0001 versus placebo). Patients receiving pioglitazone alone had greater weight gain (3 kg) than those receiving dapagliflozin plus pioglitazone (0.7–1.4 kg) at week 48. Through 48 weeks: hypoglycemia was rare; more events suggestive of genital infection were reported with dapagliflozin (8.6–9.2%) than placebo (2.9%); events suggestive of urinary tract infection showed no clear drug effect (5.0–8.5% for dapagliflozin and 7.9% for placebo); dapagliflozin plus pioglitazone groups had less edema (2.1–4.3%) compared with placebo plus pioglitazone (6.5%); and congestive heart failure and fractures were rare. CONCLUSIONS In patients with type 2 diabetes inadequately controlled on pioglitazone, the addition of dapagliflozin further reduced HbA1c levels and mitigated the pioglitazone-related weight gain without increasing hypoglycemia risk. Background Despite the number of medications for type 2 diabetes, many people with the condition do not achieve good glycaemic control. Some existing glucose-lowering agents have adverse effects such as weight gain or hypoglycaemia. Type 2 diabetes tends to be a progressive disease, and most patients require treatment with combinations of glucose-lowering agents. The sodium glucose co-transporter 2 (SGLT2) receptor inhibitors are a new class of glucose-lowering agents. Objective To assess the clinical effectiveness and safety of the SGLT2 receptor inhibitors in dual or triple therapy in type 2 diabetes.
Data sources MEDLINE, Embase, Cochrane Library (all sections); Science Citation Index; trial registries; conference abstracts; drug regulatory authorities; bibliographies of retrieved papers. Inclusion criteria Randomised controlled trials of SGLT2 receptor inhibitors compared with placebo or an active comparator in type 2 diabetes in dual or combination therapy. Methods Systematic review. Quality assessment used the Cochrane risk of bias score. Results Seven trials, published in full, assessed dapagliflozin and one assessed canagliflozin. Trial quality appeared good. Dapagliflozin 10 mg reduced HbA1c by −0.54% (weighted mean difference (WMD), 95% CI −0.67 to −0.40) compared to placebo, but there was no difference compared to glipizide. Canagliflozin reduced HbA1c slightly more than sitagliptin (up to −0.21% vs sitagliptin). Both dapagliflozin and canagliflozin led to weight loss (dapagliflozin WMD −1.81 kg (95% CI −2.04 to −1.57); canagliflozin up to −2.3 kg compared to placebo). Limitations Long-term trial extensions suggested that effects were maintained over time. Data on canagliflozin are currently available from only one paper. Costs of the drugs are not known, so cost-effectiveness cannot be assessed. More data on safety are needed, with the Food and Drug Administration having concerns about breast and bladder cancers. Conclusions Dapagliflozin appears effective in reducing HbA1c and weight in type 2 diabetes, although more safety data are needed. Dapagliflozin, administered to patients in once-daily oral doses, is a sodium–glucose cotransporter 2 (SGLT2) inhibitor that blocks the reabsorption of glucose from urine into the blood. This 14-day study randomized patients with type 2 diabetes mellitus (T2DM) to four treatment groups receiving daily oral doses of 5, 25, or 100 mg of dapagliflozin or placebo, in order to evaluate glucosuria and glycemic parameters.
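Pooled results like the dapagliflozin 10 mg HbA1c weighted mean difference of −0.54% (95% CI −0.67 to −0.40) quoted above are typically produced by inverse-variance meta-analysis of per-trial mean differences. As a minimal sketch of the fixed-effect version of that calculation, using made-up trial effects and standard errors rather than the review's actual data:

```python
import math

def pooled_wmd(effects):
    """Inverse-variance fixed-effect pooling of per-trial mean differences.

    `effects` is a list of (mean_difference, standard_error) pairs.
    Returns the pooled weighted mean difference and its 95% CI.
    """
    weights = [1.0 / se ** 2 for _, se in effects]            # precision weights
    wmd = sum(w * d for (d, _), w in zip(effects, weights)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))                 # SE of the pooled estimate
    return wmd, (wmd - 1.96 * se_pooled, wmd + 1.96 * se_pooled)

# Three hypothetical trials: HbA1c change vs placebo (%) and its standard error
trials = [(-0.54, 0.07), (-0.48, 0.10), (-0.60, 0.12)]
wmd, ci = pooled_wmd(trials)
print(round(wmd, 3), tuple(round(x, 3) for x in ci))
```

A random-effects pooling, common in reviews like this one, would additionally inflate each trial's variance by a between-trial heterogeneity estimate (e.g. DerSimonian–Laird) before weighting.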
Significant reductions in fasting serum glucose (FSG) were observed on day 2 with 100 mg dapagliflozin (−9.3%, P < 0.001), and dose-dependent reductions were observed on day 13 with the 5-mg (−11.7%; P < 0.05), 25-mg (−13.3%; P < 0.05), and 100-mg (−21.8%; P < 0.0001) doses as compared with placebo. Significant improvements in the oral glucose tolerance test (OGTT) were observed with all doses on days 2 and 13 (P < 0.001 as compared with placebo). On day 14, urine glucose values were 36.6, 70.1, and 69.9 g/day for the 5-, 25-, and 100-mg doses (as compared with no change for placebo), which were slightly lower than those on day 1. This was attributed to the decrease in filtered glucose load following improved glycemic control. Dapagliflozin produced dose-dependent increases in glucosuria and clinically meaningful changes in glycemic parameters in T2DM patients. BACKGROUND Correction of hyperglycaemia and prevention of glucotoxicity are important objectives in the management of type 2 diabetes. Dapagliflozin, a selective sodium-glucose cotransporter-2 inhibitor, reduces renal glucose reabsorption in an insulin-independent manner. We assessed the efficacy and safety of dapagliflozin in patients who have inadequate glycaemic control with metformin. METHODS In this phase 3, multicentre, double-blind, parallel-group, placebo-controlled trial, 546 adults with type 2 diabetes who were receiving daily metformin (≥ 1500 mg per day) and had inadequate glycaemic control were randomly assigned to receive one of three doses of dapagliflozin (2.5 mg, n = 137; 5 mg, n = 137; or 10 mg, n = 135) or placebo (n = 137) orally once daily. Randomisation was computer generated and stratified by site, implemented with a central, telephone-based interactive voice response system. Patients continued to receive their pre-study metformin dosing. The primary outcome was change from baseline in haemoglobin A1c (HbA1c) at 24 weeks.
All randomised patients who received at least one dose of double-blind study medication and who had both a baseline and at least one post-baseline measurement (last observation carried forward) were included in the analysis. Data were analysed by use of ANCOVA models. This trial is registered with ClinicalTrials.gov, number NCT00528879. FINDINGS 534 patients were included in the analysis of the primary endpoint (dapagliflozin 2.5 mg, n = 135; dapagliflozin 5 mg, n = 133; dapagliflozin 10 mg, n = 132; placebo, n = 134). At week 24, mean HbA1c had decreased by −0.30% (95% CI −0.44 to −0.16) in the placebo group, compared with −0.67% (−0.81 to −0.53, p = 0.0002) in the dapagliflozin 2.5 mg group, −0.70% (−0.85 to −0.56, p < 0.0001) in the dapagliflozin 5 mg group, and −0.84% (−0.98 to −0.70, p < 0.0001) in the dapagliflozin 10 mg group. Symptoms of hypoglycaemia occurred in similar proportions of patients in the dapagliflozin (2–4%) and placebo groups (3%). Signs, symptoms, and other reports suggestive of genital infections were more frequent in the dapagliflozin groups (2.5 mg, 11 patients [8%]; 5 mg, 18 [13%]; 10 mg, 12 [9%]) than in the placebo group (seven [5%]). 17 patients had serious adverse events (four in each of the dapagliflozin groups and five in the placebo group). INTERPRETATION Addition of dapagliflozin to metformin provides a new therapeutic option for treatment of type 2 diabetes in patients who have inadequate glycaemic control with metformin alone. FUNDING Bristol-Myers Squibb and AstraZeneca. Flaws in the design, conduct, analysis, and reporting of randomised trials can cause the effect of an intervention to be underestimated or overestimated.
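Several of the trial analyses described here handle dropout before the endpoint visit with last observation carried forward (LOCF): each missing visit value is replaced by the patient's most recent observed value, and the endpoint model (e.g. ANCOVA adjusting for baseline) is then fit on the imputed data. A minimal sketch of the imputation step, with hypothetical visit values:

```python
def locf(series):
    """Last observation carried forward: replace each missing visit value
    (None) with the most recent observed value. Values before the first
    observation remain missing."""
    filled, last = [], None
    for v in series:
        if v is not None:
            last = v
        filled.append(last)
    return filled

# Hypothetical HbA1c (%) at baseline and weeks 8, 16, 24 for one patient
# who dropped out after week 8; the week-24 endpoint then uses 7.6:
visits = [8.1, 7.6, None, None]
print(locf(visits))  # -> [8.1, 7.6, 7.6, 7.6]
```

LOCF is simple but can bias results when trajectories are not flat after dropout, which is one reason repeated-measures models are often reported alongside it, as in the extension analyses above.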
The Cochrane Collaboration's tool for assessing risk of bias aims to make the process clearer and more accurate. BACKGROUND Combining metformin (XR) with dapagliflozin to initiate pharmacotherapy in patients with type 2 diabetes (T2D) and high baseline HbA1c may be advantageous. We conducted two randomised, double-blind, three-arm 24-week trials in treatment-naïve patients to compare dapagliflozin plus metformin, dapagliflozin alone and metformin alone. METHODS Eligible patients had baseline HbA1c 7.5–12%. Each trial had three arms: dapagliflozin plus metformin, dapagliflozin monotherapy and metformin monotherapy. Dapagliflozin in combination and as monotherapy was dosed at 5 mg (Study 1) and 10 mg (Study 2). Metformin in combination and as monotherapy was titrated to 2000 mg. The primary endpoint was HbA1c change from baseline; secondary endpoints included change in fasting plasma glucose (FPG) and weight. RESULTS In both trials, combination therapy led to significantly greater reductions in HbA1c compared with either monotherapy: −2.05 for dapagliflozin + metformin, −1.19 for dapagliflozin, and −1.35 for metformin (p < 0.0001) (Study 1); −1.98 for dapagliflozin + metformin, −1.45 for dapagliflozin and −1.44 for metformin (p < 0.0001) (Study 2). Combination therapy was statistically superior to monotherapy in reduction of FPG (p < 0.0001 for both studies); combination therapy was more effective than metformin for weight reduction (p < 0.0001). Dapagliflozin 10 mg was non-inferior to metformin in reducing HbA1c (Study 2). Events suggestive of genital infection were reported in 6.7%, 6.9% and 2.0% (Study 1) and 8.5%, 12.8% and 2.4% (Study 2) of patients in the combination, dapagliflozin and metformin groups; events suggestive of urinary tract infection were reported in 7.7%, 7.9% and 7.5% (Study 1) and 7.6%, 11.0% and 4.3% (Study 2) of patients in the respective groups.
No major hypoglycaemia was reported. CONCLUSION In treatment-naïve patients with T2D, dapagliflozin plus metformin was generally well tolerated and effective in reducing HbA1c, FPG and weight. Dapagliflozin-induced glucosuria led to an increase in events suggestive of urinary tract and genital infections. Canagliflozin is a sodium glucose co-transporter 2 inhibitor in development for type 2 diabetes mellitus (T2DM). The efficacy and safety of canagliflozin were evaluated in subjects with T2DM inadequately controlled with diet and exercise. AIMS Progressive deterioration of glycaemic control in type 2 diabetes mellitus (T2DM) often requires treatment intensification. Dapagliflozin increases urinary glucose excretion by selective inhibition of renal sodium-glucose cotransporter 2 (SGLT2). We assessed the efficacy, safety and tolerability of dapagliflozin added to glimepiride in patients with uncontrolled T2DM. METHODS This 24-week, randomized, double-blind, placebo-controlled, parallel-group, international, multicentre trial (ClinicalTrials.gov NCT00680745) enrolled patients with uncontrolled T2DM [haemoglobin A1c (HbA1c) 7–10%] receiving sulphonylurea monotherapy. Adult patients (n = 597) were randomly assigned to placebo or dapagliflozin (2.5, 5 or 10 mg/day) added to open-label glimepiride 4 mg/day for 24 weeks. The primary endpoint was HbA1c mean change from baseline at 24 weeks. Secondary endpoints included change in body weight and other glycaemic parameters. RESULTS At 24 weeks, HbA1c adjusted mean changes from baseline for placebo versus the dapagliflozin 2.5/5/10 mg groups were −0.13 versus −0.58, −0.63, −0.82%, respectively (all p < 0.0001 vs. placebo by Dunnett's procedure). Corresponding body weight and fasting plasma glucose values were −0.72, −1.18, −1.56, −2.26 kg and −0.11, −0.93, −1.18, −1.58 mmol/l, respectively.
In the placebo versus dapagliflozin groups, serious adverse events were 4.8 versus 6.0–7.1%; hypoglycaemic events 4.8 versus 7.1–7.9%; events suggestive of genital infection 0.7 versus 3.9–6.6%; and events suggestive of urinary tract infection 6.2 versus 3.9–6.9%. No kidney infections were reported. CONCLUSIONS Dapagliflozin added to glimepiride in patients with T2DM uncontrolled on sulphonylurea monotherapy significantly improved HbA1c, reduced weight and was generally well tolerated, although events suggestive of genital infections were reported more often in patients receiving dapagliflozin. OBJECTIVE To determine whether dapagliflozin, which selectively inhibits renal glucose reabsorption, lowers hyperglycemia in patients with type 2 diabetes that is poorly controlled with high insulin doses plus oral antidiabetic agents (OADs). RESEARCH DESIGN AND METHODS This was a randomized, double-blind, three-arm parallel-group, placebo-controlled, 26-center trial (U.S. and Canada). Based on data from an insulin dose-adjustment setting cohort (n = 4), patients in the treatment cohort (n = 71) were randomly assigned 1:1:1 to placebo, 10 mg dapagliflozin, or 20 mg dapagliflozin, plus OAD(s) and 50% of their daily insulin dose. The primary outcome was change from baseline in A1C at week 12 (dapagliflozin vs. placebo, last observation carried forward [LOCF]). RESULTS At week 12 (LOCF), the 10- and 20-mg dapagliflozin groups demonstrated −0.70 and −0.78% mean differences in A1C change from baseline versus placebo. In both dapagliflozin groups, 65.2% of patients achieved a decrease from baseline in A1C ≥ 0.5% versus 15.8% in the placebo group. Mean changes from baseline in fasting plasma glucose (FPG) were +17.8, +2.4, and −9.6 mg/dl (placebo, 10 mg dapagliflozin, and 20 mg dapagliflozin, respectively). Postprandial glucose (PPG) reductions with dapagliflozin also showed dose dependence.
Mean changes in total body weight were −1.9, −4.5, and −4.3 kg (placebo, 10 mg dapagliflozin, and 20 mg dapagliflozin). Overall, adverse events were balanced across all groups, although more genital infections occurred in the 20-mg dapagliflozin group than in the placebo group. CONCLUSIONS In patients receiving high insulin doses plus insulin sensitizers who had their baseline insulin reduced by 50%, dapagliflozin decreased A1C, produced better FPG and PPG levels, and lowered weight more than placebo. Background Management of type 2 diabetes with metformin often does not provide adequate glycemic control, thereby necessitating add-on treatment. In a 24-week clinical trial, dapagliflozin, an investigational sodium glucose cotransporter 2 inhibitor, improved glycemic control in patients inadequately controlled with metformin. The present study is an extension that was undertaken to evaluate dapagliflozin as long-term therapy in this population. Methods This was a long-term extension (total 102 weeks) of a 24-week phase 3, multicenter, randomized, placebo-controlled, double-blind, parallel-group trial. Patients were randomly assigned (1:1:1:1) to blinded daily treatment (placebo, or dapagliflozin 2.5, 5, or 10 mg) plus open-label metformin (≥ 1,500 mg). The previously published primary endpoint was change from baseline in glycated hemoglobin (HbA1c) at 24 weeks. This paper reports the follow-up to week 102, with an analysis of covariance model performed at 24 weeks with last observation carried forward; a repeated-measures analysis was utilized to evaluate changes from baseline in HbA1c, fasting plasma glucose (FPG), and weight. Results A total of 546 patients were randomized to 1 of the 4 treatments. The completion rate for the 78-week double-blind extension period was lower for the placebo group (63.5%) than for the dapagliflozin groups (68.3% to 79.8%).
At week 102, mean changes from baseline HbA1c (8.06%) were +0.02% for placebo compared with −0.48% (P = 0.0008), −0.58% (P < 0.0001), and −0.78% (P < 0.0001) for dapagliflozin 2.5, 5, and 10 mg, respectively. In addition, all dapagliflozin groups had sustained reductions from baseline in FPG (−1.07 to −1.47 mmol/l) and body weight (−1.10 to −1.74 kg) at 102 weeks, whereas increases were noted in placebo-treated patients for both of these outcomes. Events of hypoglycemia were rare and were not severe. Evidence suggestive of genital infection was reported in 11.7% to 14.6% of dapagliflozin patients and 5.1% of placebo patients, with one related discontinuation (dapagliflozin 5 mg). Evidence suggestive of urinary tract infection was reported in 8.0% to 13.3% of dapagliflozin patients and 8.0% of placebo patients, with one related discontinuation (dapagliflozin 2.5 mg). Conclusions Dapagliflozin added to metformin for 102 weeks enabled sustained reductions in HbA1c, FPG, and weight without increased risk of hypoglycemia in patients with type 2 diabetes who were inadequately controlled on metformin alone. Trial registration: ClinicalTrials.gov. Canagliflozin is a sodium glucose co-transporter 2 inhibitor in development for treatment of type 2 diabetes mellitus (T2DM). This study evaluated the efficacy and safety of canagliflozin in subjects with T2DM and stage 3 chronic kidney disease [CKD; estimated glomerular filtration rate (eGFR) ≥ 30 and < 50 ml/min/1.73 m2]. AIMS Dapagliflozin, a selective sodium-glucose cotransporter 2 (SGLT2) inhibitor, reduces hyperglycaemia in patients with type 2 diabetes (T2DM) by increasing urinary glucose excretion. Owing to its mechanism of action, dapagliflozin could potentially affect the renal tubular transport of bone minerals.
Therefore, markers of bone formation and resorption and bone mineral density (BMD) were evaluated in patients with T2DM after 50 weeks of dapagliflozin treatment. METHODS This international, multi-centre, randomized, parallel-group, double-blind, placebo-controlled study (ClinicalTrials.gov NCT00855166) enrolled patients with T2DM (women 55–75 years and men 30–75 years; HbA1c 6.5–8.5%; BMI ≥ 25 kg/m2; body weight ≤ 120 kg) whose T2DM was inadequately controlled on metformin. One hundred and eighty-two patients were randomly assigned 1:1 to receive dapagliflozin 10 mg/day or placebo added to open-label metformin for a 24-week double-blind treatment period followed by a 78-week site- and patient-blinded extension period. At week 50, serum markers of bone formation (procollagen type 1 N-terminal propeptide; P1NP) and resorption (C-terminal cross-linking telopeptides of type I collagen; CTX), bone mineral density (BMD) as assessed by standardized dual-energy X-ray absorptiometry (DXA) measurements, and adverse events of fracture were evaluated as safety objectives. RESULTS One hundred and sixty-five patients (90.7%) completed the first 50 weeks. Compared with placebo, no significant changes from baseline in P1NP, CTX or BMD were identified over 50 weeks of dapagliflozin treatment, with no significant treatment-by-gender interactions. No fractures were reported.
CONCLUSIONS Dapagliflozin had no effect on markers of bone formation and resorption or BMD after 50 weeks of treatment in both male and post-menopausal female patients whose T2DM was inadequately controlled on metformin. AIM This randomized, double-blind, placebo-controlled, parallel-group study assessed the effects of sodium glucose cotransporter 2 inhibition by dapagliflozin on insulin sensitivity and secretion in subjects with type 2 diabetes mellitus (T2DM) who had inadequate glycemic control with metformin (with or without an insulin secretagogue). SUBJECTS AND METHODS Forty-four subjects were randomized to receive dapagliflozin 5 mg or matching placebo once daily for 12 weeks. Subjects continued stable doses of background antidiabetes medication throughout the study. Insulin sensitivity was assessed by measuring the glucose disappearance rate (GDR) during the last 40 min of a 5-h hyperinsulinemic, euglycemic clamp. Insulin secretion was determined as the acute insulin response to glucose (AIRg) during the first 10 min of a frequently sampled intravenous glucose tolerance test. Where noted, data were adjusted for baseline values and background antidiabetes medication. RESULTS An adjusted mean increase from baseline in GDR (last observation carried forward) at week 12 was observed with dapagliflozin (7.98%) versus a decrease with placebo (−9.99%). The 19.97% (95% confidence interval 5.75–36.10) difference in GDR versus placebo was statistically significant (P = 0.0059). A change from baseline in adjusted mean AIRg of 15.39 mU/L·min was observed with dapagliflozin at week 12, versus −12.73 mU/L·min with placebo (P = 0.0598).
Over 12 weeks, numerical reductions from baseline in glycosylated hemoglobin (HbA1c), fasting plasma glucose, and body weight were observed with dapagliflozin (−0.38%, −0.39 mmol/L, and −1.58%, respectively) versus slight numerical increases with placebo (0.03%, 0.26 mmol/L, and 0.62%, respectively). CONCLUSIONS In patients with T2DM and inadequate glycemic control, dapagliflozin treatment improved insulin sensitivity in the setting of reductions in HbA1c and weight. BACKGROUND Dapagliflozin, a selective inhibitor of sodium-glucose cotransporter 2, may improve glycemic control with a lower dose of insulin and attenuate the associated weight gain in patients with inadequate control despite high doses of insulin. OBJECTIVE To evaluate the efficacy and safety of adding dapagliflozin therapy in patients whose type 2 diabetes mellitus is inadequately controlled with insulin with or without oral antidiabetic drugs. DESIGN A 24-week, randomized, placebo-controlled, multicenter trial followed by a 24-week extension period. An additional 56-week extension period is ongoing. (ClinicalTrials.gov registration number: NCT00673231) SETTING 126 centers in Europe and North America from 30 April 2008 to 19 November 2009. PATIENTS 808 patients with inadequately controlled type 2 diabetes mellitus receiving at least 30 U of insulin daily, with or without up to 2 oral antidiabetic drugs. INTERVENTION Patients were randomly assigned in a 1:1:1:1 ratio and allocated with a computer-generated scheme to receive placebo or 2.5, 5, or 10 mg of dapagliflozin, once daily, for 48 weeks. MEASUREMENTS The primary outcome was change in hemoglobin A1c from baseline to 24 weeks. Secondary outcomes included changes in body weight, insulin dose, and fasting plasma glucose level at 24 weeks and during the 24-week extension period. Adverse events were evaluated throughout both 24-week periods. RESULTS 800 patients were analyzed.
After 24 weeks, mean hemoglobin A1c decreased by 0.79% to 0.96% with dapagliflozin compared with 0.39% with placebo (mean difference, −0.40% [95% CI, −0.54% to −0.25%] in the 2.5-mg group, −0.49% [CI, −0.65% to −0.34%] in the 5-mg group, and −0.57% [CI, −0.72% to −0.42%] in the 10-mg group). Daily insulin dose decreased by 0.63 to 1.95 U with dapagliflozin and increased by 5.65 U with placebo (mean difference, −7.60 U [CI, −10.32 to −4.87 U] in the 2.5-mg group, −6.28 U [CI, −8.99 to −3.58 U] in the 5-mg group, and −6.82 U [CI, −9.56 to −4.09 U] in the 10-mg group). Body weight decreased by 0.92 to 1.61 kg with dapagliflozin and increased by 0.43 kg with placebo (mean differences, −1.35 kg [CI, −1.90 to −0.80 kg] in the 2.5-mg group, −1.42 kg [CI, −1.97 to −0.88 kg] in the 5-mg group, and −2.04 kg [CI, −2.59 to −1.48 kg] in the 10-mg group). These effects were maintained at 48 weeks. Compared with the placebo group, patients in the pooled dapagliflozin groups had a higher rate of hypoglycemic episodes (56.6% vs. 51.8%), events suggesting genital infection (9.0% vs. 2.5%), and events suggesting urinary tract infection (9.7% vs. 5.1%). LIMITATION Insulin doses were not titrated to target, and the study was not designed to evaluate long-term safety. CONCLUSION Dapagliflozin improves glycemic control, stabilizes insulin dosing, and reduces weight without increasing major hypoglycemic episodes in patients with inadequately controlled type 2 diabetes mellitus. PRIMARY FUNDING SOURCE AstraZeneca and Bristol-Myers Squibb. AIM Dapagliflozin is a selective sodium-glucose co-transporter 2 (SGLT2) inhibitor under development as a treatment for type 2 diabetes mellitus (T2DM). This study assessed the efficacy and safety of dapagliflozin monotherapy in Japanese T2DM patients with inadequate glycaemic control.
METHODS Patients (n = 279) were randomized to receive dapagliflozin (1, 2.5, 5 or 10 mg/day) or placebo once daily for 12 weeks. The primary endpoint was change from baseline in haemoglobin A1c (HbA1c) at week 12. Secondary endpoints included change from baseline in fasting plasma glucose (FPG) and the proportion of patients achieving HbA1c < 7.0% at week 12. RESULTS Significant reductions in HbA1c were seen with all dapagliflozin doses (−0.11 to −0.44%) versus placebo (+0.37%). Reductions were also observed in FPG with dapagliflozin (−0.87 to −1.77 mmol/l [−15.61 to −31.94 mg/dl]) versus placebo (+0.62 mmol/l [+11.17 mg/dl]). No significant difference in the proportion of patients achieving HbA1c levels < 7.0% was noted with dapagliflozin versus placebo. Adverse events (AEs) were more frequent with dapagliflozin (40.7–53.8%) versus placebo (38.9%) and were mostly mild/moderate in intensity. Three hypoglycaemic events were reported (one each with placebo, dapagliflozin 2.5 mg and 10 mg). The frequency of signs and symptoms suggestive of urinary tract or genital infections was 0–3.8% and 0–1.8%, respectively, with dapagliflozin and 1.9% and 0% with placebo. No AEs of pyelonephritis were observed. CONCLUSIONS Compared with placebo, dapagliflozin significantly reduced hyperglycaemia over 12 weeks with a low risk of hypoglycaemia in Japanese T2DM patients with inadequate glycaemic control. AIMS Remogliflozin etabonate (RE) is the pro-drug of remogliflozin (R), a selective inhibitor of renal sodium-dependent glucose transporter 2 (SGLT2) that improves glucose control via enhanced urinary glucose excretion (UGE). This study evaluated the safety, tolerability, pharmacokinetics and pharmacodynamics of repeated doses of RE in subjects with type 2 diabetes mellitus (T2DM).
METHODS In a double-blinded, randomized, placebo-controlled trial, subjects who were drug-naïve or had metformin discontinued received RE [100 mg BID (n = 9), 1000 mg QD (n = 9), 1000 mg BID (n = 9)] or placebo (n = 8) for 12 days. Safety parameters were assessed, including urine studies to evaluate renal function. Plasma concentrations of RE and metabolites were measured with the first dose and at steady state. RE effects on glucose levels were assessed with fasting glucose concentrations, frequently sampled 24-h glucose profiles and oral glucose tolerance tests. RESULTS No significant laboratory abnormalities or safety events were reported; the most frequent adverse events were headache and flatulence. Plasma exposure to RE and R was proportional to the administered dose, with negligible accumulation. Mean 24-h UGE increased in the RE treatment groups. Compared with the placebo group, 24-h mean (95% CI) changes in plasma glucose were −1.2 (−2.2 to −0.3) (100 mg BID), −0.8 (−1.7 to 0.2) (1000 mg QD) and −1.7 (−2.7 to −0.8) mmol/l (1000 mg BID). CONCLUSIONS Administration of RE for 12 days is well tolerated and results in clinically meaningful improvements in plasma glucose, accompanied by changes in body weight and blood pressure, in subjects with T2DM. BACKGROUND The sodium-dependent glucose co-transporter 2 (SGLT2) is a high-capacity, low-affinity transport system primarily expressed in the renal proximal tubules, where it plays an important role in the regulation of glucose levels. Inhibition of SGLT2 represents an innovative approach for plasma glucose control in type 2 diabetes mellitus (T2DM) by blocking glucose reabsorption and enhancing glucose loss in the urine. METHODS This Phase 2, randomized, placebo-controlled study investigated the safety, tolerability, and pharmacokinetic and pharmacodynamic profiles of the novel oral SGLT2 inhibitor ipragliflozin (ASP1941) in T2DM patients.
Sixty-one patients were randomized to placebo or ipragliflozin once daily at doses of 50, 100, 200, or 300 mg for 28 days. Patients were admitted to the clinic during the study and received a weight-maintenance diet. RESULTS The incidence of treatment-emergent adverse events was similar for the placebo and ipragliflozin groups. There were no deaths, and no patients discontinued ipragliflozin because of adverse events. Ipragliflozin was absorbed rapidly, taking approximately 1 h to reach the maximum concentration. The area under the concentration-time curve and maximum ipragliflozin concentration at steady state displayed dose linearity. All ipragliflozin doses significantly reduced glycosylated hemoglobin, fasting plasma glucose, and the mean amplitude of glucose excursions compared with placebo. Significant dose-dependent increases in urinary glucose excretion were observed in all ipragliflozin groups. Mean weight decreased in the placebo and ipragliflozin groups, with greater reductions occurring in ipragliflozin-treated patients. CONCLUSION Ipragliflozin was generally safe, well tolerated, and effective at blocking renal glucose reabsorption and decreasing plasma glucose levels in T2DM patients. AIM To evaluate the efficacy, safety, and tolerability of multiple doses of ipragliflozin. This novel selective inhibitor of sodium glucose co-transporter 2 is in clinical development for the treatment of patients with type 2 diabetes mellitus (T2DM). METHODS In a 12-week, multicenter, double-blind, randomized, active- and placebo-controlled dose-finding study, patients were randomized to one of four ipragliflozin treatment groups (12.5, 50, 150, and 300 mg once daily), placebo, or active control (metformin). The primary efficacy outcome was the mean change from baseline to week 12 in glycosylated hemoglobin (HbA1c) compared with placebo.
RESULTS Ipragliflozin showed a dose-dependent decrease in HbA1c of −0.49% to −0.81% at week 12 compared with placebo (P < 0.001); a decrease of −0.72% was seen with metformin. Among the ipragliflozin groups there was also a dose-dependent reduction in body weight of up to 1.7 kg. Proportions of patients experiencing treatment-emergent adverse events were similar across all groups: ipragliflozin (45.7–58.8%), placebo (62.3%), and metformin (59.4%). No clinically relevant effects were observed for other safety measures. CONCLUSIONS After 12 weeks of treatment, ipragliflozin ≥ 50 mg/day dose-dependently decreased HbA1c in patients with T2DM, an effect comparable to metformin. No safety or tolerability concerns were identified. AIMS Many patients with type 2 diabetes are suboptimally managed with currently available therapies. Dapagliflozin, a sodium-glucose co-transporter-2 inhibitor, has shown efficacy in reducing diabetic hyperglycaemia. This study assessed the efficacy of three lower doses in recently diagnosed patients. METHODS This phase 3, randomized, double-blind, placebo-controlled study assigned treatment-naïve patients to placebo or dapagliflozin monotherapy (1, 2.5 or 5 mg) daily for 24 weeks. Patients were antidiabetic drug-naïve with inadequate glycaemic control [haemoglobin A1c (HbA1c) ≥ 7.0 and ≤ 10.0%]. The primary efficacy endpoint was change in HbA1c from baseline. Secondary endpoints included changes in body weight and fasting plasma glucose (FPG), and the proportion achieving HbA1c < 7%. RESULTS A total of 282 patients with type 2 diabetes were randomly assigned to one of four treatment groups. Baseline characteristics were similar across groups. At week 24, the mean HbA1c reduction was significantly greater with dapagliflozin: −0.68% for 1 mg, −0.72% for 2.5 mg, −0.82% for 5 mg, versus 0.02% for placebo (p < 0.0001), compared to mean baseline values of 7.8–8.1%.
Mean FPG reduction was significantly greater for all dapagliflozin groups versus placebo (p < 0.02), as was mean weight reduction (p < 0.003). During the treatment period, 19.1% of placebo-treated patients received rescue medication or discontinued because of poor glycaemic control versus 6.9, 4.1 and 5.9% for dapagliflozin 1, 2.5 and 5 mg, respectively. Percentages of patients experiencing ≥ 1 adverse event were similar across groups. CONCLUSION Dapagliflozin at doses of 1, 2.5 and 5 mg/day is effective in reducing glycaemic levels and body weight in treatment-naïve patients with type 2 diabetes. Dapagliflozin was generally well tolerated. This insulin-independent mechanism suggests a new treatment for type 2 diabetes. AIM This Phase IIb, randomized, double-blind, placebo-controlled trial evaluated the efficacy, safety, tolerability and pharmacokinetics of empagliflozin in patients with type 2 diabetes. METHODS Four hundred and eight patients (treatment-naïve or after a 4-week wash-out period) were randomized to receive empagliflozin 5, 10 or 25 mg once daily, placebo or open-label metformin for 12 weeks. The primary endpoint was change in haemoglobin A1c (HbA1c) after 12 weeks. RESULTS After 12 weeks' treatment, empagliflozin showed dose-dependent reductions in HbA1c from baseline [5 mg: −0.4%, 10 mg: −0.5%, 25 mg: −0.6%; all doses p < 0.0001 vs. placebo (+0.09%)]. Fasting plasma glucose (FPG) decreased with empagliflozin [5 mg: −1.29 mmol/l, 10 mg: −1.61 mmol/l, 25 mg: −1.72 mmol/l; all doses p < 0.0001 vs. placebo (+0.04 mmol/l)]. Body weight decreased in all empagliflozin groups (all doses p < 0.001 vs. placebo). The incidence of adverse events (AEs) was similar in the placebo (32.9%) and empagliflozin (29.1%) groups. The most frequently reported AEs on empagliflozin were pollakiuria (3.3% vs. 0% for placebo), thirst (3.3% vs. 0% for placebo) and nasopharyngitis (2.0% vs.
1.2 % for placebo ) . AEs consistent with urinary tract infections ( UTIs ) were reported in four ( 1.6 % ) patients on empagliflozin vs. one ( 1.2 % ) on placebo . Genital infections were reported in five ( 2 % ) patients on empagliflozin vs. 0 % on placebo . No UTIs or genital infections led to premature discontinuation . CONCLUSIONS In patients with type 2 diabetes , empagliflozin resulted in dose-dependent , clinically meaningful reductions in HbA1c and FPG , and reductions in body weight compared with placebo . Empagliflozin was well-tolerated with a favourable safety profile OBJECTIVE Although initially effective , sulfonylureas are associated with poor glycemic durability , weight gain , and hypoglycemia . Dapagliflozin , a selective inhibitor of sodium-glucose cotransporter 2 ( SGLT2 ) , reduces hyperglycemia by increasing urinary glucose excretion independent of insulin and may cause fewer of these adverse effects . We compared the efficacy , safety , and tolerability of dapagliflozin with the sulfonylurea glipizide in patients with type 2 diabetes inadequately controlled with metformin monotherapy . RESEARCH DESIGN AND METHODS This 52-week , double-blind , multicenter , active-controlled , noninferiority trial randomized patients with type 2 diabetes ( baseline mean HbA1c , 7.7 % ) , who were receiving metformin monotherapy , to add-on dapagliflozin ( n = 406 ) or glipizide ( n = 408 ) up-titrated over 18 weeks , based on glycemic response and tolerability , to ≤ 10 or ≤ 20 mg/day , respectively . RESULTS The primary end point , adjusted mean HbA1c reduction with dapagliflozin ( -0.52 % ) compared with glipizide ( -0.52 % ) , was statistically noninferior at 52 weeks .
Key secondary end points : dapagliflozin produced significant adjusted mean weight loss ( -3.2 kg ) versus weight gain ( 1.2 kg ; P < 0.0001 ) with glipizide , significantly increased the proportion of patients achieving ≥ 5 % body weight reduction ( 33.3 % ) versus glipizide ( 2.5 % ; p < 0.0001 ) , and significantly decreased the proportion experiencing hypoglycemia ( 3.5 % ) versus glipizide ( 40.8 % ; p < 0.0001 ) . Events suggestive of genital infections and lower urinary tract infections were reported more frequently with dapagliflozin compared with glipizide but responded to standard treatment and rarely led to study discontinuation . CONCLUSIONS Despite similar 52-week glycemic efficacy , dapagliflozin reduced weight and produced less hypoglycemia than glipizide in type 2 diabetes inadequately controlled with metformin . Long-term studies are required to further evaluate genital and urinary tract infections with SGLT2 inhibitors AIM Canagliflozin is a sodium-glucose co-transporter 2 ( SGLT2 ) inhibitor that is being investigated for the treatment of type 2 diabetes mellitus ( T2DM ) . METHODS This was a randomized , double-blind , placebo-controlled , parallel-group , 28-day study conducted at two sites , in 29 subjects with T2DM not optimally controlled on insulin and up to one oral antihyperglycaemic agent . Subjects were treated with canagliflozin 100 mg QD or 300 mg twice daily ( BID ) or placebo . Safety , tolerability , pharmacokinetic characteristics and pharmacodynamic effects of canagliflozin were examined . Glucose malabsorption following a 75-g oral glucose challenge was also examined . RESULTS Canagliflozin pharmacokinetics were dose-dependent , and the elimination half-life ranged from 12 to 15 h.
After 28 days , the renal threshold for glucose excretion was reduced ; urinary glucose excretion was increased ; and A1C , fasting plasma glucose and body weight decreased in subjects administered canagliflozin ( A1C reductions : 0.19 % with placebo , 0.73 % with 100 mg QD , 0.92 % with 300 mg BID ; body weight changes : 0.03 kg increase with placebo , 0.73 kg reduction with 100 mg QD , 1.19 kg reduction with 300 mg BID ) . Glucose malabsorption was not observed with canagliflozin treatment . There were no deaths , serious adverse events or severe hypoglycaemic episodes . The incidence of adverse events was similar across groups . There were no clinically meaningful changes in routine laboratory safety tests , vital signs or electrocardiograms . CONCLUSION In subjects receiving insulin and oral antihyperglycaemic therapy , canagliflozin was well tolerated without evidence for glucose malabsorption , had pharmacokinetic characteristics consistent with once-daily dosing , and improved glycaemic control OBJECTIVE Dapagliflozin , a novel inhibitor of renal sodium-glucose cotransporter 2 , allows an insulin-independent approach to improve type 2 diabetes hyperglycemia . In this multiple-dose study we evaluated the safety and efficacy of dapagliflozin in type 2 diabetic patients . RESEARCH DESIGN AND METHODS Type 2 diabetic patients were randomly assigned to one of five dapagliflozin doses , metformin XR , or placebo for 12 weeks . The primary objective was to compare mean change from baseline in A1C . Other objectives included comparison of changes in fasting plasma glucose ( FPG ) , weight , adverse events , and laboratory measurements . RESULTS After 12 weeks , dapagliflozin induced moderate glucosuria ( 52–85 g urinary glucose/day ) and demonstrated significant glycemic improvements versus placebo ( ΔA1C −0.55 to −0.90 % and ΔFPG −16 to −31 mg/dl ) . Weight loss change versus placebo was −1.3 to −2.0 kg . There was no change in renal function .
Serum uric acid decreased , serum magnesium increased , serum phosphate increased at higher doses , and dose-related 24-h urine volume and hematocrit increased , all of small magnitude . Treatment-emergent adverse events were similar across all groups . CONCLUSIONS Dapagliflozin improved hyperglycemia and facilitates weight loss in type 2 diabetic patients by inducing controlled glucosuria with urinary loss of ∼200–300 kcal/day . Dapagliflozin treatment demonstrated no persistent , clinically significant osmolarity , volume , or renal status changes
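The trials above report efficacy as mean change from baseline in each arm, and the headline treatment effect is the placebo-adjusted difference between those changes. A trivial sketch of that arithmetic, using the week-24 HbA1c figures quoted for dapagliflozin 5 mg and placebo (the function name is illustrative, not from any of the studies):

```python
# Placebo-adjusted treatment effect: mean change from baseline on drug
# minus mean change from baseline on placebo (both in percentage points of HbA1c).

def placebo_adjusted_change(change_drug: float, change_placebo: float) -> float:
    """Difference in mean change from baseline between the two arms."""
    return change_drug - change_placebo

# Dapagliflozin 5 mg: -0.82 %; placebo: +0.02 % (as reported above)
effect = placebo_adjusted_change(-0.82, 0.02)
print(round(effect, 2))  # -0.84 percentage points vs. placebo
```

The same subtraction applies to FPG and body-weight endpoints; confidence intervals and p-values in the abstracts come from adjusted models, not from this raw difference.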
11,038
29,231,243
The very low-quality evidence available means that we are uncertain whether or not a multiple-session programme of NMES combined with exercise over several weeks versus exercise alone results in clinically important differences in knee pain and function at the end of the treatment period or at one year . There were no data on adverse effects such as muscle fatigue and discomfort .
BACKGROUND Patellofemoral pain syndrome , now generally referred to as patellofemoral pain ( PFP ) , is one of the most common orthopaedic disorders , characterised by pain in the anterior or retropatellar knee region . Neuromuscular electrical stimulation ( NMES ) has been proposed generally as a complementary treatment , associated with other interventions such as exercise , or as a single treatment to increase muscle force , reduce knee pain , and improve function . OBJECTIVES To assess the effects ( benefits and harms ) of neuromuscular electrical stimulation for people with patellofemoral pain .
Background Self-reported knee pain is highly prevalent among adolescents . As much as 50 % of the non-specific knee pain may be attributed to Patellofemoral Pain Syndrome ( PFPS ) . In the short term , exercise therapy appears to have a better effect than patient education consisting of written information and general advice on exercise or compared with placebo treatment . But the long-term effect of exercise therapy compared with patient education is conflicting . The purpose of this study is to examine the short- and long-term effectiveness of patient education compared with patient education and multimodal physiotherapy applied at a very early stage of the condition among adolescents . Methods / Design This study is a single blind pragmatic cluster randomised controlled trial . Four upper secondary schools have been invited to participate in the study ( approximately 2500 students , aged 15 - 19 years ) . Students are asked to answer an online questionnaire regarding musculoskeletal pain . The students who report knee pain are contacted by telephone and offered a clinical examination by a rheumatologist . Subjects who fit the inclusion criteria and are diagnosed with PFPS are invited to participate in the study . A minimum of 102 students with PFPS are then cluster-randomised into two intervention groups based on which school they attend . Both intervention groups receive written information and education . In addition to patient education , one group receives multimodal physiotherapy consisting primarily of neuromuscular training of the muscles around the foot , knee and hip and home exercises . The students with PFPS fill out self-reported questionnaires at baseline , 3 , 6 , 12 and 24 months after inclusion in the study . The primary outcome measure is perception of recovery measured on a 7-point Likert scale ranging from " completely recovered " to " worse than ever " at 12 months .
Discussion This study is designed to investigate the effectiveness of patient education compared with patient education combined with multimodal physiotherapy . If patient education and multimodal physiotherapy applied at an early stage of Patellofemoral Pain Syndrome proves effective , it may serve as a basis for optimising the clinical pathway for those suffering from the condition , where specific emphasis can be placed on early diagnosis and early treatment . Trial Registration clinicaltrials.gov reference : Objectives Describe proportions of individuals with patellofemoral pain ( PFP ) with an unfavourable recovery over 12 months ; identify clinical predictors of poor recovery at 3 and 12 months ; and determine baseline values of predictors that identify those with poor 12-month prognosis . Methods An observational analysis utilised data from 310 individuals with PFP enrolled in two randomised clinical trials . Thirteen baseline variables ( participant , PFP , study characteristics ) were investigated for their prognostic ability . Pain , function and global recovery were measured at 3 and 12 months . Multivariate backward stepwise regression analyses ( treatment-adjusted , p<0.10 ) were performed for each follow-up measure . Receiver operator characteristic curves identified cut-points associated with unfavourable recovery at 12 months . Results 55 % and 40 % of participants had an unfavourable recovery at 3 and 12 months , respectively . Longer baseline pain duration was significantly associated with poor 3-month and 12-month recovery on measures of pain severity ( β 11.36 to 24.94 ) , Anterior Knee Pain ( AKP ) Scale ( −4.44 to −11.33 ) and global recovery ( OR : 2.32 to 6.11 ) . Greater baseline pain severity and lower AKP Scale score were significantly associated with poor recovery on multiple measures ( p<0.05 ) . Baseline duration > 2 months and AKP Scale score < 70/100 were associated with unfavourable 12-month recovery .
Conclusions A substantial number of individuals with PFP have an unfavourable recovery over 12 months , irrespective of intervention . Knee pain duration > 2 months is the most consistent prognostic indicator , followed by AKP Scale score < 70 . Sports medicine practitioners should utilise interventions with known efficacy in reducing PFP , and promote early intervention to maximise prognosis . Trial registration Australian study : Australian Clinical Trials Registry ( ACTRN012605000463673 ) , ClinicalTrials.gov ( NCT00118521 ) ; Dutch study : International Standard Randomised Controlled Trial Number Register ( ISRCTN83938749 OBJECTIVE To determine the effects of electromyographic biofeedback treatment in patients with patellofemoral pain syndrome . DESIGN Randomized controlled trial . SETTING A physical medicine and rehabilitation department in a research hospital of a university referral center . PATIENTS Sixty patients with patellofemoral pain syndrome . Patients were randomly placed into 2 groups : biofeedback group ( n = 30 ) and a control group ( n = 30 ) . INTERVENTION The biofeedback group received electromyographic biofeedback training and a conventional exercise program , whereas the control group received a conventional exercise program only . MAIN OUTCOME MEASURES Maximum and mean contraction values of the vastus medialis and the vastus lateralis muscles were assessed with the biofeedback device . Pain and functional status of the patients were measured by a visual analog scale ( VAS ) and the Functional Index Questionnaire ( FIQ ) , respectively . RESULTS Contraction values improved significantly at the end of the first month , compared with the pretreatment values in both groups . Mean contraction values in the biofeedback group of the vastus medialis muscles in all 3 monthly measurements , and the vastus lateralis muscles at the end of the first month , were significantly higher than those of the control group .
Significant improvements were shown for both the VAS and the FIQ in both groups . Monthly follow-ups showed no VAS and FIQ differences between the groups . CONCLUSION Electromyographic biofeedback treatment did not result in further clinical improvement when compared with a conventional exercise program in patients with patellofemoral pain syndrome Purpose Neuromuscular electrical stimulation ( NMES ) with large electrodes and multiple current pathways ( m-NMES ) has recently been proposed as a valid alternative to conventional NMES ( c-NMES ) for quadriceps muscle (re)training . The main aim of this study was to compare discomfort , evoked force and fatigue between m-NMES and c-NMES of the quadriceps femoris muscle in healthy subjects . Methods Ten healthy subjects completed two experimental sessions ( c-NMES and m-NMES ) , that were randomly presented in a cross-over design . Maximal electrically evoked force at pain threshold , self-reported discomfort at different levels of evoked force , and fatigue-induced force declines during and following a series of 20 NMES contractions were compared between c-NMES and m-NMES . Results m-NMES resulted in greater evoked force ( P < 0.05 ) and lower discomfort in comparison to c-NMES ( P < 0.05–0.001 ) , but fatigue time course and magnitude did not differ between the two conditions . Conclusions The use of quadriceps m-NMES appears legitimate for (re)training purposes because it generated stronger contractions and was less uncomfortable than c-NMES ( due to multiple current pathways and/or lower current density with larger electrodes ) Abstract Traditional conservative treatment for patellar disorders is successful in about 80 percent of cases . We introduced two new conservative treatment protocols for patellar pathology in order to further improve the success rate .
The first protocol consisted of high load/low repetition quadriceps femoris training ( 10 patients ) while the second enclosed selective electrostimulation of vastus medialis muscle ( 7 patients ) . Results were evaluated clinically and neurophysiologically . High load/low repetition training resulted in significant increase of maximal voluntary contraction of quadriceps muscle ( P < 0.001 ) . Significant gain of Activity ( P = 0.017 ) and Kujala scores ( P = 0.07 ) was observed in group with high load/low repetition quadriceps training compared to patients with electrostimulation . There was no significant change in neurophysiological or clinical status between the beginning and the end of treatment with electrostimulation . Our results indicate that high load/low repetition quadriceps femoris training poses an important alternative to traditional conservative treatment protocol for patellar disorders OBJECTIVE To compare a commercially available electric muscle stimulation regimen with a novel form of stimulation for the rehabilitation of the quadriceps muscle , in patients with patellofemoral pain syndrome . DESIGN Double-blinded randomized trial with a parallel control group and stratified randomization . SETTING Home-based rehabilitation program assessed in research center . PARTICIPANTS Eighty patients ( 47 women , 33 men ) with patellofemoral pain syndrome . INTERVENTIONS One group ( EMPI ) received 1 uniform constant frequency component of 35Hz . The other ( EXPER ) group received an experimental form of stimulation that contained 5 simultaneously delivered frequency components of 125 , 83 , 50 , 2.5 , and 2Hz . Stimulation was applied to the quadriceps muscles of the affected leg for 1 hour daily for 6 weeks , a total of 42 treatments .
MAIN OUTCOME MEASURES Lower-limb isometric and isokinetic torque , quadriceps fatigue , knee flexion , patellar pain , a step test , quadriceps cross-sectional area , and Kujala patellofemoral score for pain before and after treatment . RESULTS Seventy-four patients ( 43 women , 31 men ) completed the trial . Patients in both groups showed significant improvements in all outcomes ( P<.05 ) . No significant differences existed between the 2 stimulators in any outcome ( P>.05 ) except for quadriceps cross-sectional area ( P=.023 ) . CONCLUSIONS One form of stimulation was just as efficacious as the other in improving subjective and objective measures A linear analogue for rating pain with 10 , 15 and 20 cm lines is significantly less variable than a 5 cm line ( mean error of 15 cm line is 0 - 19 % , 95 % confidence limits for the group +/- 2 % ) and there is a good correlation between repeated ratings of a recalled pain distant in time . The variance of the rating is significantly less than the repeated rating of a random mark . The linear analogue rating of a constant pain stimulus is reproducible and changes in rating are likely to be real changes of opinion . Pethidine 150 mg intramuscularly had no significant effect , tested 30 minutes after the administration , on the accuracy or reproducibility of the analogue rating . A linear analogue seems a suitable method of recording the patient 's opinion of a severe pain such as that of labour The aim of this study was to assess the clinical efficacy and safety of an NMES program applied in male soccer players ( after ACL reconstruction ) on the quadriceps muscle . The 80 participants ( NMES = 40 , control = 40 ) received an exercise program , including three sessions weekly .
The individuals in NMES group additionally received neuromuscular electrical stimulation procedures on both right and left quadriceps ( biphasic symmetric rectangular pulses , frequency of impulses : 2500 Hz , and train of pulses frequency : 50 Hz ) three times daily ( 3 hours of break between treatments ) , 3 days a week , for one month . The tensometry , muscle circumference , and goniometry pendulum test ( follow-up after 1 and 3 months ) were applied . The results of this study show that NMES ( in presented parameters in experiment ) is useful for strengthening the quadriceps muscle in soccer athletes . There is evidence of the benefit of the NMES in restoring quadriceps muscle mass and strength of soccer players . In our study the neuromuscular electrical stimulation appeared to be safe for biomechanics of knee joint . The pathological changes in knee function were not observed . This trial is registered with Australian and New Zealand Clinical Trials Registry ACTRN12613001168741 PURPOSE Although physical therapy is known to be effective in treating patellofemoral pain ( PFP ) , there is considerable individual variation in the treatment response . It is unclear why some patients benefit from a specific treatment while others do not experience improvement . This study , using a prospective study design , aims to identify factors that could predict the short-term functional outcome and account for the variation frequently seen in the outcome after conservative treatment of PFP . METHODS Thirty-six patients ( 20 female and 16 male with a mean age of 23.8 ± 6.7 yr ) followed a physical therapy rehabilitation program of 7 wk . Before this treatment , all patients were evaluated on subjective symptoms ( pain on visual analog scales in millimeters ) and functional performance ( step test expressed as highest level , single-legged hop test in centimeters , and triple-hop test in centimeters ) .
The concentric and eccentric knee extensor strength at 60 ° .s(-1 ) and 240 ° .s(-1 ) ( N.m ) were measured as well as the quadriceps muscle size by calculating the cross-sectional area ( cm(2 ) ) with magnetic resonance imaging . The success of the treatment was evaluated by the functional Kujala anterior knee pain scale . A linear regression model was used to identify predisposing factors for the functional outcome . RESULTS The total quadriceps cross-sectional area ( P = 0.010 ) , the eccentric average peak torque at 60 ° .s(-1 ) ( P = 0.015 ) , and the frequency of pain at baseline ( P = 0.012 ) have been indicated as predisposing variables in the short-term functional outcome after a physical therapy rehabilitation program for PFP ( adjusted R(2 ) = 0.46 ) . CONCLUSION Patients with a greater quadriceps muscle size , lower eccentric knee strength , and less pain have a better short-term functional outcome after conservative treatment for PFP OBJECTIVE To examine the test-retest reliability , validity , and responsiveness of several outcome measures in the treatment of patellofemoral pain . DESIGN Evaluation of the clinimetric properties of individual outcome measures for patellofemoral pain treatment , using data collected from a previously published randomized controlled trial ( RCT ) . SETTING General community and private practice . PARTICIPANTS The data from 71 persons enrolled in an RCT of a conservative intervention for patellofemoral pain were used to evaluate the measures ' validity and responsiveness . A subset of this cohort ( n=20 ) was used to assess reliability . INTERVENTIONS Not applicable .
MAIN OUTCOME MEASURES Three 10-cm visual analog scales ( VASs ) for usual pain ( VAS-U ) , worst pain ( VAS-W ) , and pain on 6 aggravating activities ( walking , running , squatting , sitting , ascending and descending stairs ) ( VAS-activity ) ; the Functional Index Questionnaire ( FIQ ) ; the Anterior Knee Pain Scale ( AKPS ) ; and the global rating of change . RESULTS The test-retest reliability ranged from poor ( intraclass correlation coefficient [ICC]=.49 ) to good ( ICC=.83 ) , and the measures correlated moderately with each other ( r range,.56-.72 ) . Median change scores differed significantly between improved and unimproved persons for all measures . The effect sizes for VAS-U ( .79 ) , VAS-W ( .88 ) , and the AKPS ( .98 ) were large , indicating greater responsiveness than the FIQ ( .37 ) and VAS-activity ( .66 ) . Similarly , the AKPS and VAS-W were the most efficient measures for detecting a treatment effect when compared with a reference measure ( VAS-U , which was assigned a value of 1 ) . The minimal difference that patients or clinicians consider clinically important for the AKPS is 10 ( out of 100 ) points and for the VAS it is 2 cm ( out of 10 cm ) . CONCLUSIONS The AKPS and VAS for usual or worst pain are reliable , valid , and responsive and are therefore recommended for future clinical trials or clinical practice in assessing treatment outcome in persons with patellofemoral pain OBJECTIVES To evaluate the beneficial effect of training in patients with patellofemoral pain syndrome ( PFPS ) and influence of additional electric muscle stimulation ( EMS ) of the knee extensor muscles . DESIGN A randomized clinical trial . SETTING Supervised physiotherapy ( PT ) training and home-based EMS . PARTICIPANTS Patients ( N=38 ; 14 men , 24 women ) with bilateral PFPS . INTERVENTIONS One group ( PT ) received supervised PT training for 12 weeks . The other received PT and EMS .
The stimulation protocol was applied to the knee extensors for 20 minutes , 2 times daily , 5 times a week for 12 weeks at 40 Hz , with a pulse duration of 0.26 ms , at 5 seconds on and 10 seconds off . Maximal tolerable stimulation intensity was up to 80 mA. MAIN OUTCOME MEASURES Patellofemoral pain assessment with visual analog scale during activities of daily life , Kujala patellofemoral score , and isometric strength measurement before and after 12 weeks treatment as well as after 1 year . RESULTS Thirty-six patients completed the 12-week follow-up . There was a statistically significant reduction of pain in both groups ( PT group , P=.003 ; PT and EMS group , P<.001 ) and significant improvement of the Kujala score in both groups ( PT group , P<.001 ; PT and EMS group , P<.001 ) after 12 weeks of treatment with improvement of function and reduction of pain at the 1-year follow-up . The difference between the 2 treatment groups was statistically not significant . We could not measure any significant change in isometric knee extensor strength in either group . CONCLUSIONS A supervised PT program can reduce pain and improve function in patients with PFPS . We did not detect a significant additional effect of EMS with the protocol described previously Objective Patellar malalignment is a major cause of patellofemoral pain syndrome ( PFPS ) , but the relationship between clinical symptoms and changes in patellar position and knee muscle strength has not been confirmed . This study examined the effect of weight training on hip and knee muscle strength , patellofemoral joint contact area , and patellar tilt on subjects with and without PFPS , hoping to develop an optimal rehabilitation protocol for subjects with PFPS . Design The study uses a prospective independent group comparison . Fifteen subjects with and without PFPS were assessed for knee strength , patellofemoral joint contact area , and patellar tilt angle using magnetic resonance imaging .
The subjects with PFPS were also examined and given a numeric pain rating score and a Kujala patellofemoral score . The subjects performed lower-limb weight training 3 times/wk for 8 wks , and the outcomes were assessed both before and after training . Results Subjects with PFPS have increased their patellofemoral joint contact area after weight training ( P < 0.001 ) . No statistically significant change was found on the patellar tilt angle . The isometric and isokinetic knee strength in subjects with and without PFPS have increased after weight training ( P value increased from 0.007 to 0.05 ) . Both numeric pain rating and Kujala patellofemoral score in the PFPS group improved after training ( P < 0.001 ) . Conclusions Weight-training exercise increased knee muscle strength and the patellofemoral joint contact area , which could reduce mechanical stress in the joint , improving pain and function in subjects with PFPS Background Although physical therapy forms the mainstay of nonoperative management for patellofemoral pain , its efficacy has not been established . Hypothesis Significantly more pain relief will be achieved from a 6-week regimen of physical therapy than from placebo treatment . Study Design Multicenter , randomized , double-blinded , placebo-controlled trial . Methods Seventy-one subjects , 40 years of age or younger with patellofemoral pain of 1 month or longer , were randomly allocated to a physical therapy or placebo group . A standardized treatment program consisted of six treatment sessions , once weekly . Physical therapy included quadriceps muscle retraining , patellofemoral joint mobilization , and patellar taping , and daily home exercises . The placebo treatment consisted of sham ultrasound , light application of a nontherapeutic gel , and placebo taping . Results Sixty-seven participants completed the trial .
The physical therapy group ( N = 33 ) demonstrated significantly greater reduction in the scores for average pain , worst pain , and disability than did the placebo group ( N = 34 ) . Conclusions A six-treatment , 6-week physical therapy regimen is efficacious for alleviation of patellofemoral pain Objective : To compare a commercially available electrical muscle stimulation regime with a new form of stimulation for the rehabilitation of the quadriceps in patients with patellofemoral pain syndrome . Setting : A research facility within a teaching hospital . Methods : Sixteen patients ( four men , 12 women ) with patellofemoral pain , demonstrable quadriceps atrophy , but normal gait parameters were randomly allocated to one of two treatment groups . One group received a sequential mixed frequency stimulation pattern from a standard device . The other group received a new form of stimulation from an experimental stimulation device that contained simultaneous mixed frequency components . Outcome measures : Isometric and isokinetic extension torque , muscle fatigue rate , pain , functional questionnaire , step test , knee flexion , and quadriceps cross-sectional area . Results : These showed significant improvements for both groups after treatment ( p < 0.05 ) in all outcome measures except flexion and fatigue rates , but no significant differences between the two stimulation regimes ( p > 0.05 ) . Conclusion : Both stimulators performed similarly on patients with patellofemoral pain giving significant improvements for all patients for muscle strength , pain , self-reporting function and step testing . There were no significant differences between the two types of stimulation Purpose The aim of this study was to assess muscle torque , total volume , and cross-sectional area , and lower limb function of the quadriceps muscle in women with unilateral patellofemoral pain syndrome ( PFPS ) .
Methods Twenty-four women with unilateral patellofemoral pain participated in the study , with each subject acting as their own internal control by using the unaffected limb . Quadriceps muscle torque was measured with the Isomed 2000 ® . The total volume and cross-sectional area ( CSA ) of the quadriceps muscle were measured by using magnetic resonance imaging . Lower limb function was assessed by hop and step-down tests . Results There was a significant difference in the total volume ( P < 0.05 ) and in the cross-sectional area ( P < 0.05 ) of the quadriceps muscle between affected and unaffected sides . There was a significant difference in the peak torque of the quadriceps muscle at 60 ° /s between affected and unaffected sides ( P < 0.05 ) . There were significant correlations between quadriceps largest CSA and volume on the affected side ( P < 0.05 ) and on the unaffected side ( P < 0.05 ) . There were significant negative correlations between the smallest CSA and the peak torque at 180 ° /s ( P < 0.05 ) and at 60 ° /s ( P < 0.05 ) on the affected side . Conclusions Decreased torque , total volume , and CSA of the quadriceps muscle are present in unilateral PFPS although cause or effect cannot be established . Large prospective longitudinal studies are needed to detect the changes in the muscle structure and to establish whether these features are a cause of PFPS The CONSORT statement is used worldwide to improve the reporting of randomised controlled trials . Kenneth Schulz and colleagues describe the latest version , CONSORT 2010 , which updates the reporting guideline based on new methodological evidence and accumulating experience . To encourage dissemination of the CONSORT 2010 Statement , this article is freely accessible on bmj.com and will also be published in the Lancet , Obstetrics and Gynecology , PLoS Medicine , Annals of Internal Medicine , Open Medicine , Journal of Clinical Epidemiology , BMC Medicine , and Trials
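The prognosis study in this block selects cut-points (pain duration > 2 months, AKP Scale < 70) from receiver operator characteristic curves. A common way to do this is to choose the threshold that maximises Youden's J (sensitivity + specificity − 1). A self-contained sketch with invented data; the duration values and outcomes below are hypothetical, not taken from the study:

```python
# ROC-based cut-point selection via Youden's J, a sketch of the approach
# named in the abstract above. Data are invented for illustration.

def youden_cutpoint(values, outcomes):
    """Return (threshold t, J) maximising sensitivity + specificity - 1,
    predicting 'unfavourable' (outcome == 1) when value > t."""
    best_t, best_j = None, -1.0
    for t in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, outcomes) if v > t and y == 1)
        fn = sum(1 for v, y in zip(values, outcomes) if v <= t and y == 1)
        tn = sum(1 for v, y in zip(values, outcomes) if v <= t and y == 0)
        fp = sum(1 for v, y in zip(values, outcomes) if v > t and y == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1
        if j > best_j:
            best_j, best_t = j, t
    return best_t, best_j

# pain duration in months vs. unfavourable 12-month recovery (1 = unfavourable)
durations = [1, 1, 2, 2, 3, 4, 6, 8]
outcomes = [0, 0, 0, 1, 1, 1, 1, 1]
t, j = youden_cutpoint(durations, outcomes)  # here t == 2, i.e. "> 2 months"
```

In practice the study's cut-points also had to perform well against several recovery measures; Youden's J is just one standard criterion for reading a single cut-point off a ROC curve.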
11,039
30,221,454
Conclusions Despite widespread calls for shared decision making to be embedded in health care , there is little evidence to inform shared decision making for hypertension , one of the most common conditions managed in primary care
Abstract Background Hypertension ( high blood pressure ) is a common long‐term health condition . Patient involvement in treating and monitoring hypertension is essential . Control of hypertension improves population cardiovascular outcomes . However , for an individual , potential benefits and harms of treatment are finely balanced . Shared decision making has the potential to align decisions with the preferences and values of patients . Objective Determine the effectiveness of interventions to support shared decision making in hypertension .
A randomized controlled trial of an information and medical record booklet designed to improve patient understanding and participation in the management of hypertension was conducted in six inner London general practices. After one year there were no significant differences between the group who had received the booklets and the control group in mean systolic or diastolic blood pressure, but the study group scored significantly higher on knowledge about hypertension and its management. However, the difference between the two groups was small, possibly because both groups started with a high level of understanding about hypertension and its management. In addition, the mean diastolic blood pressure in the control group showed that the treatment provided was already satisfactory, and that there was little need for improvement. Nevertheless, the information booklet evaluated in this study provides health professionals with a highly acceptable method of informing the patient about hypertension and its management and could be used both in hospital and general practice Background Hypertension is one of the key factors causing cardiovascular diseases, which make up the most frequent cause of death in industrialised nations. However, about 60% of hypertensive patients in Germany treated with antihypertensives do not reach the recommended target blood pressure. The involvement of patients in medical decision making not only fulfils an ethical imperative but, furthermore, has the potential of higher treatment success. One concept to enhance the active role of patients is shared decision making. Until now there exists little information on the effects of shared decision making training for general practitioners on patient participation and on lowering blood pressure in hypertensive patients.
Methods / Design In a cluster-randomised controlled trial 1800 patients receiving antihypertensives will be screened with 24 h ambulatory blood pressure monitoring in their general practitioners' practices. Only patients who have not reached their blood pressure target (approximately 1200) will remain in the study (T1–T3). General practitioners of the intervention group will take part in a shared decision making training after baseline assessment (T0). General practitioners of the control group will treat their patients as usual. Primary endpoints are change of systolic blood pressure and change of patients' perceived participation. Secondary endpoints are changes of diastolic blood pressure, knowledge, medical adherence and cardiovascular risk. Data analysis will be performed with mixed effects models. Discussion The hypothesis underlying this study is that shared decision making, realised by a shared decision making training for general practitioners, activates patients, facilitates patients' empowerment and contributes to better hypertension control. This study is the first one that tests this hypothesis with a (cluster-)randomised trial and a large sample size. Trial registration WHO International Clinical Trials: http://apps.who.int/trialsearch/Trial.aspx?TrialID = Background To improve risk factor management in diabetes, we need to support effective interactions between patients and healthcare providers. Our aim is to develop and evaluate a treatment decision aid that offers personalised information on treatment options and outcomes, and is intended to empower patients in taking a proactive role in their disease management. Important features are: (1) involving patients in setting goals together with their provider; (2) encouraging them to prioritise treatments that maximise relevant outcomes; and (3) integration of the decision aid in the practice setting and workflow.
As a secondary aim, we want to evaluate the impact of different presentation formats, and learn more from the experiences of the healthcare providers and patients with the decision aid. Methods and design We will conduct a randomised trial comparing four formats of the decision aid in a 2 × 2 factorial design with a control group. Patients with type 2 diabetes managed in 18 to 20 primary care practices in The Netherlands will be recruited. Excluded are patients with a recent myocardial infarction, stroke, heart failure, angina pectoris, terminal illness, cognitive deficits, > 65 years at diagnosis, or not able to read Dutch. The decision aid is offered to the patients immediately before their quarterly practice consultation. The same decision information will be available to the healthcare provider for use during consultation. In addition, the providers receive a set of treatment cards, which they can use to discuss the benefits and risks of different options. Patients in the control group will receive care as usual. We will measure the effect of the intervention on patient empowerment, satisfaction with care, beliefs about medication, negative emotions, health status, prescribed medication, and predicted cardiovascular risk. Data will be collected with questionnaires and automated extraction from medical records in the 6 months before and after the intervention. Discussion This decision aid is innovative in supporting patients and their healthcare providers to make shared decisions about multiple treatments, using the patient's data from electronic medical records. The results can contribute to the further development and implementation of electronic decision support tools for the management of chronic diseases. Trial registration Dutch Trial register NTR1942 CONTEXT Decision aids can increase patient involvement in treatment decision making. However, questions remain regarding their effects and cost implications.
OBJECTIVE To evaluate the effects of information, with and without a structured preference elicitation interview, on treatment choices, health outcomes, and costs. DESIGN, SETTING, AND PARTICIPANTS A randomized controlled trial with 2 years of follow-up. Between October 1996 and February 1998, 894 women with uncomplicated menorrhagia were recruited from 6 hospitals in southwest England. Women were randomized to the control group, information alone group (information), or information plus interview group (interview). INTERVENTIONS Women in both intervention groups were sent an information pack (a booklet and complementary videotape) 6 weeks before their specialist consultation. Immediately before their consultation, women in the interview group underwent a structured interview to clarify and elicit their preferences. MAIN OUTCOME MEASURES Self-reported health status was the main outcome; secondary outcomes included treatments received and costs. Cost analyses adopted a UK health service (payer) perspective, were based on patient-reported resource use data, and are reported in 1999-2000 US dollars. RESULTS The interventions had no consistent effect on health status. Hysterectomy rates were lower for women in the interview group (38%) (adjusted odds ratio [OR], 0.60; 95% confidence interval [CI], 0.38-0.96) than in the control group (48%) and women who received the information alone (48%) (adjusted OR, 0.52; 95% CI, 0.33-0.82). The interview group had lower mean costs ($1566) than the control group ($2751) (mean difference, $1184; 95% CI, $684-$2110) and the information group $2026 (mean difference, $461; 95% CI, $236-$696). CONCLUSIONS Neither intervention had an effect on health status.
Providing women with information alone did not affect treatment choices; however, the addition of an interview to clarify values and elicit preferences had a significant effect on women's management and resulted in reduced costs The authors assessed whether patient empowerment in the management of hypertension improved more with the practice of shared decision making (SDM) than by education programs. In a prospective controlled clinical study, 15 general practitioners in Nuremberg, Germany who were specially trained to conduct SDM consultations participated in a 12-month study. Hypertensive patients (N=86) were included; N=40 were in the SDM group and N=46 were in the control group, if blood pressures were ≥135/85 mm Hg (self measurement) and patients had no signs of cardiovascular complications or severe hypertension. All participants in the SDM group and the control group were enrolled in an education program on hypertension in small groups. The SDM group participants also had 4 special consultations to share medical decisions. The main outcome measure was the effect of SDM on blood pressure control. After 1 year blood pressure had decreased in all participants: Δ −9.26 ± 10.2 mm Hg / Δ −5.33 ± 9.5 mm Hg in the SDM group (P<0.001) compared to Δ −6.0 ± 11.8 mm Hg / Δ −3.0 ± 8.3 mm Hg in the control group. There was no significant difference between the 2 groups. The study group practiced more SDM than controls, but blood pressure control was not significantly better. Patient empowerment by means of an education program in small groups and creating awareness of hypertensive disease helps to improve the outcome of hypertension treatment.
SDM, however, did not improve management when compared to an education program, which is much easier to implement in general practice Concordance -- that is, shared decision-making between doctors and patients -- is nowadays accepted as an integral part of good clinical practice. It is of particular importance in the case of treatments with only marginal benefits such as those recommended in guidelines for the management of common, chronic diseases. However, the implementation of guideline-based medicine conflicts with that of concordance. Studies indicate that patients are not adequately informed about their treatment. Clinical guidelines for conditions such as cardiovascular disease are based on large-scale randomized trials and the complex nature of the data limits effective communication, especially in an environment characterized by time constraints. But other factors may be more relevant, notably pressures to comply with guidelines and financial rewards for meeting targets: it is simply not in the interests of doctors to disclose accurate information. Studies show that patients are far from impressed by the small benefits derived from large-scale trials. Indeed, faced with absolute risk reductions, patients decline treatment promoted by guidelines. To participate in clinical decisions, patients require unbiased information concerning outcomes with and without treatment, and the absolute risk reduction; they should be told that most patients receiving long-term medication obtain no benefit despite being exposed to adverse drug reactions; furthermore, they should be made aware of the questionable validity of large-scale trials and that these studies may be influenced by those with a vested interest.
Genuine concordance will inevitably lead to many patients rejecting the recommendations of guidelines and encourage a more critical approach to clinical research and guideline-based medicine Background Hypertension is one of the key factors causing cardiovascular diseases. A substantial proportion of treated hypertensive patients do not reach recommended target blood pressure values. Shared decision making (SDM) is intended to enhance the active role of patients. As until now there exists little information on the effects of SDM training in antihypertensive therapy, we tested the effect of an SDM training programme for general practitioners (GPs). Our hypotheses are that this SDM training (1) enhances the participation of patients and (2) leads to an enhanced decrease in blood pressure (BP) values, compared to patients receiving usual care without prior SDM training for GPs. Methods The study was conducted as a cluster randomised controlled trial (cRCT) with GP practices in Southwest Germany. Each GP practice included patients with treated but uncontrolled hypertension and/or with relevant comorbidity. After baseline assessment (T0) GP practices were randomly allocated into an intervention and a control arm. GPs of the intervention group took part in the SDM training. GPs of the control group treated their patients as usual. The intervention was blinded to the patients. Primary endpoints on patient level were (1) change of patients' perceived participation (SDM-Q-9) and (2) change of systolic BP (24h mean). Secondary endpoints were changes of (1) diastolic BP (24h mean), (2) patients' knowledge about hypertension, (3) adherence (MARS-D), and (4) cardiovascular risk score (CVR). Results In total 1357 patients from 36 general practices were screened for blood pressure control by ambulatory blood pressure monitoring (ABPM).
Thereof, 1120 patients remained in the study because of uncontrolled (but treated) hypertension and/or a relevant comorbidity. At T0 the intervention group involved 17 GP practices with 552 patients and the control group 19 GP practices with 568 patients. The effectiveness analysis could not demonstrate a significant or relevant effect of the SDM training on any of the endpoints. Conclusion The study hypothesis that the SDM training enhanced patients' perceived participation and lowered their BP could not be confirmed. Further research is needed to examine the impact of patient participation on the treatment of hypertension in primary care. Trial registration German Clinical Trials Register (DRKS): Background Disparities in health and healthcare are extensively documented across clinical conditions, settings, and dimensions of healthcare quality. In particular, studies show that ethnic minorities and persons with low socioeconomic status receive poorer quality of interpersonal or patient-centered care than whites and persons with higher socioeconomic status. Strong evidence links patient-centered care to improvements in patient adherence and health outcomes; therefore, interventions that enhance this dimension of care are promising strategies to improve adherence and overcome disparities in outcomes for ethnic minorities and poor persons. Objective This paper describes the design of the Patient-Physician Partnership (Triple P) Study. The goal of the study is to compare the relative effectiveness of the patient and physician intensive interventions, separately and in combination with one another, with the effectiveness of minimal interventions. The main hypothesis is that patients in the intensive intervention groups will have better adherence to appointments, medication, and lifestyle recommendations at three and twelve months than patients in minimal intervention groups.
The study also examines other process and outcome measures, including patient-physician communication behaviors, patient ratings of care, health service utilization, and blood pressure control. Methods A total of 50 primary care physicians and 279 of their ethnic minority or poor patients with hypertension were recruited into a randomized controlled trial with a two-by-two factorial design. The study used a patient-centered, culturally tailored education and activation intervention for patients, with active follow-up delivered by a community health worker in the clinic. It also included a computerized self-study communication skills training program for physicians, delivered via an interactive CD-ROM, with tailored feedback to address their individual communication skills needs. Conclusion The Triple P study will provide new knowledge about how to improve patient adherence, quality of care, and cardiovascular outcomes, as well as how to reduce disparities in care and outcomes of ethnic minority and poor persons with hypertension Patients who ask questions, elicit treatment options, express opinions, and state preferences about treatment during office visits with physicians have measurably better health outcomes than patients who do not [1-5]. Patients who feel that they have participated in decision making are more likely to follow through on those decisions than those who do not [6]. This type of involvement has particular relevance for chronic disease care, in which most of the treatment plan must be carried out by patients. To achieve maximum treatment effectiveness, physicians must persuade patients with chronic diseases to commit to and follow through on treatment recommendations.
Physicians who routinely involve patients with chronic diseases in treatment decisions (presenting options, discussing the pros and cons of those options, eliciting patient preferences, and reaching mutually agreed-on treatment plans) can be said to have a shared or participatory decision-making style. Such physicians may have greater success in securing patient cooperation and therefore may have better patient health outcomes than physicians with more controlling decision-making styles. Although research has emphasized greater patient involvement in treatment decisions, less attention has been paid to the physician as an agent for obtaining this result. In studies of patients with hypertension [1], noninsulin-dependent diabetes mellitus [2, 5], peptic ulcer disease [3], and rheumatoid arthritis [7], patients whose physicians were less controlling (or more participatory) during office visits had better functional status and lower follow-up glycosylated hemoglobin levels, blood pressure, and arthritis severity than patients of less participatory physicians. These studies suggest that physicians vary widely in how much they facilitate patients' active participation in treatment decisions (what we have termed a participatory decision-making style). Our study was part of the Medical Outcomes Study, an exploration of differences in the organization of health care delivery and physician reimbursement mechanisms and specialty, and their effect on patients' health outcomes. We examined personal and practice characteristics of physicians likely to influence variations in participatory decision-making style, defined as the propensity to offer patients choices among treatment options and to give them a sense of control and responsibility for care.
We relate physicians' background and training, sociodemographic characteristics, practice volume, and satisfaction with professional autonomy to the patient-reported measure of physicians' participatory decision-making styles, after adjusting for patient and practice characteristics. Methods Study Design The study design and sampling strategies of the Medical Outcomes Study have been described [8, 9]. Data for this study derive from a cross-sectional sample of patients obtained from the offices of physicians participating in the Medical Outcomes Study during a 9-day study enrollment period in 1986. Physician and Patient Samples Physicians 31 to 55 years of age who were board eligible or board certified in general internal medicine, family practice, cardiology, or endocrinology were included. The resulting sample of 337 physicians was restricted to the 300 who completed a clinician background questionnaire. These physicians comprised 28 cardiologists (9.3%), 20 endocrinologists (6.7%), 79 family physicians (26.3%), and 173 general internists (57.7%). They were primarily young (mean age ± SD, 39.2 ± 6.3 years), white (83.7%) men (78.5%). One hundred fifty-five (51.7%) physicians practiced in solo or single-specialty group practices; 52 (17.3%), in multispecialty groups; and 93 (31.0%), in health maintenance organizations. Of 22,463 adult patients of physicians participating in the Medical Outcomes Study, a random half were asked about participatory decision-making style on their screening questionnaire [10]. The patient sample for this analysis was further restricted to only patients (n = 7730) of the 300 physicians who completed the clinician background questionnaire. The average age of patients included in this study was 46.5 ± 17.7 years (range, 18 to 106 years); 20.2% were 65 years of age or older.
The patients had completed 13.7 ± 3.0 years of education; most (61.3%) were women, and 21.7% were nonwhite. More than half (53%) had one or more major chronic conditions [8]. Data Collection Data for this study were taken from the Medical Outcomes Study patient and physician screening visit questionnaires and the clinician background questionnaire [10]. The three questions measuring participatory decision-making style were included on the patient screening visit questionnaire. Patients were asked to rate their physicians' style on a five-point scale, using the following questions: 1) If there were a choice between treatments, would this doctor ask you to help make the decision? [definitely yes to definitely no]; 2) How often does this doctor make an effort to give you some control over your treatment? [very often to never]; and 3) How often does this doctor ask you to take some of the responsibility for your treatment? [very often to not at all]. We created a scaled score by calculating a sum of these items. We then transformed this scale to range from 0 to 100 by subtracting the lowest possible value of the scale from the raw score, dividing the result by the maximum possible range of scores, and multiplying that result by 100. The mean ± SD, averaged across each physician's patients, for this transformed three-item scale along with its internal consistency reliability are presented in Table 1. The average interclass correlation among scores for each physician was relatively high (r = 0.62; P < 0.001). The average number of patients providing participatory decision-making style scores for each physician was 25.8 ± 11.2. Table 1.
Summary of Major Study Variables Questions on the personal and practice characteristics of physicians, from the clinician background questionnaire, included practice volume (measured as the number of outpatient visits that physicians reported having in a typical week of their main professional practice), participation in primary care and interviewing skills training during residency, and satisfaction with professional autonomy. Items were rated on a five-point scale, ranging from very satisfied to very dissatisfied. On the physician screening visit questionnaire, physicians were also asked how long they spent face to face with each study patient for the office visit. Because physicians might see the same volume of patients distributed over more or less concentrated blocks of time, we also measured time spent per patient to assess practice volume. Analyses using time spent per patient compared with practice volume produced highly similar results [11]. Analytic Strategy The analysis involved two stages. To minimize the effects of patient characteristics on physician style, we did a patient-level multiple regression analysis using patient-reported participatory decision-making style as the dependent variable and the patient's age, the patient's education level, the patient's sex, the patient's minority status, type of payment for health care, presence of eight selected common chronic conditions, the patient's perceived health status, and each physician's identification number (entered as dummy variables) as independent variables and covariates. We used individual physician identification numbers to remove the variation in physicians' participatory decision-making style that might have been due to unobserved physician characteristics.
The results of this patient-level regression analysis [11] showed that patients who were older than 75 years of age or younger than 30 years of age, had less than a high school education, were not white, were male, had more comorbid conditions and poorer perceived health status, and reported a shorter relationship with the Medical Outcomes Study physician reported less participatory visits. Twenty-one percent of patients were seeing the Medical Outcomes Study physician for the first time, and another 6% were seeing the physician on an ongoing consultative basis. Because first and consultative visits were similarly distributed across the physician specialty groups and did not account for the observed differences in participatory decision-making style among the specialties, we retained these visits in the analysis. To account for differences in these patient characteristics in Medical Outcomes Study physicians' patient samples, we used this patient-level regression model to calculate an expected participatory decision-making style score for each Medical Outcomes Study physician. We then subtracted this expected score from the observed score (patients' actual style reports) for each physician. In all analyses, we have used the difference between this observed and the regression-generated expected participatory decision-making style score, transformed to range from 0 to 100 using the technique described above, as the dependent variable in statistical models. The use of the observed score minus the expected score results in a narrow numeric range but minimizes the variation in participatory decision-making style that would otherwise have been due to differences in Medical Outcomes Study physicians' patient panels.
Observed differences in participatory decision-making style related to physician characteristics ranged from 33.2% to 80.6% of the standard deviation of the adjusted observed-minus-expected participatory decision-making style score. In the second stage of the analysis, we developed physician-level models, relating the observed score minus the regression-generated expected score to physician's age, sex, minority status, specialty, site of care, and type of practice (health maintenance organization, fee-for-service), entered as dummy variables, and statistical Objective To assess the effects of a patient-oriented decision aid for prioritising treatment goals in diabetes compared with usual care on patient empowerment and treatment decisions. Design Pragmatic randomised controlled trial. Setting 18 general practices in the north of the Netherlands. Participants 344 patients with type 2 diabetes aged ≤65 years at the time of diagnosis and managed in primary care between April 2011 and August 2012: 225 were allocated to the intervention group and 119 to the usual care group. Intervention The intervention comprised a decision aid for people with diabetes, with individually tailored risk information and treatment options for multiple risk factors. The aid was intended to empower patients to prioritise between clinical domains and to support treatment decisions. It was offered to participants before a regular diabetes check-up and to their healthcare provider during the consultation. Four different formats of the decision aid were included for additional explorative analyses. Main outcome measures The primary outcome was the effect on patient empowerment for setting and achieving goals. The secondary outcomes were changes in the prescribing of drugs to regulate glucose, blood pressure, lipids, and albuminuria.
Data were collected through structured questionnaires and automated data extraction from electronic health records during six months before and after the intervention. Results Of all intervention participants, 103 (46%) reported to have received the basic elements of the intervention. For the primary outcome analysis, 199 intervention and 107 control patients with sufficient baseline and follow-up data could be included. The mean empowerment score increased 0.1 on a 5-point scale in the overall intervention group, which was not significantly different from that of the control group (mean difference after adjusting for baseline 0.039, 95% confidence interval −0.056 to 0.134). Lipid regulating drug treatment was intensified in 25% of intervention and 12% of control participants with increased cholesterol levels, which did not reach significance when the intervention was compared with the usual care group (odds ratio 2.54, 95% confidence interval 0.89 to 7.23). Prespecified explorative analyses showed that this effect was significant for the printed version of the decision aid in comparison to usual care (3.90, 1.29 to 11.80). No relevant or significant changes were seen for other treatments. Conclusion We found no evidence that the patient-oriented treatment decision aid improves patient empowerment by an important amount. The aid was not used to its full extent by a substantial number of participants. Trial registration Dutch trial register NTR1942 BACKGROUND There is a lack of evidence regarding the value of tools designed to aid decision making in patients with newly diagnosed hypertension. AIM To evaluate two interventions for assisting newly diagnosed hypertensive patients in the decision whether to start drug therapy for reducing blood pressure. DESIGN OF STUDY Factorial randomised controlled trial. SETTING Twenty-one general practices in south-west England, UK.
METHOD Adults aged 32 to 80 years with newly diagnosed hypertension were randomised to receive either: (a) a computerised utility assessment interview with individualized risk assessment and decision analysis; or (b) an information video and leaflet about high blood pressure; or (c) both interventions; or (d) neither intervention. Outcome measures were decisional conflict, knowledge, state anxiety, intentions regarding starting treatment, and actual treatment decision. RESULTS Of 217 patients randomised, 212 (98%) were analysed at the primary follow-up (mean age = 59 years, 49% female). Decision analysis patients had lower decisional conflict than those who did not receive this intervention (27.6 versus 38.9, 95% confidence interval [CI] for adjusted difference = -13.0 to -5.8, P < 0.001), greater knowledge about hypertension (73% versus 67%, adjusted 95% CI = 2% to 9%, P = 0.003) and no evidence of increased state anxiety (34.8 versus 36.8, adjusted 95% CI = -5.6 to 0.1, P = 0.055). Video/leaflet patients had lower decisional conflict than corresponding controls (30.3 versus 36.8, adjusted 95% CI = -7.4 to -0.6, P = 0.021), greater knowledge (75% versus 65%, adjusted 95% CI = 6% to 13%, P < 0.001) and no evidence of increased state anxiety (35.7 versus 36.1, adjusted 95% CI = -3.9 to 1.7, P = 0.46). There were no differences between either of the interventions and their respective controls in the proportion of patients prescribed antihypertensive medication (67%).
CONCLUSIONS This trial demonstrates that, among patients facing a real treatment decision, interventions to inform patients about hypertension and to clarify patients' values concerning outcomes of treatment are effective in reducing decisional conflict and increasing patient knowledge, while not resulting in any increases in state anxiety This study is a 3-year follow-up of a factorial randomised controlled trial of two decision aids - decision analysis and information video plus leaflet - for newly diagnosed hypertensive patients. We found no evidence of differences for either of the two decision aids compared with controls for the primary outcome of blood pressure control at follow-up. There were also no differences in any of the secondary outcomes measured - the proportion taking blood pressure lowering drugs, self-reported medication adherence, or consulting behaviour. The randomised controlled trial cohort as a whole, irrespective of randomised group, demonstrated substantial reductions in blood pressure and 10-year cardiovascular risk over the follow-up period OBJECTIVE To develop and psychometrically test a brief patient-report instrument for measuring Shared Decision Making (SDM) in clinical encounters. METHODS We revised an existing instrument (Shared Decision Making Questionnaire; SDM-Q), including the generation of new items and changing the response format. A 9-item version (SDM-Q-9) was developed and tested in a German primary care sample of 2351 patients via face validity ratings, investigation of acceptance, as well as factor and reliability analysis. Findings were cross-validated in a randomly selected subsample. RESULTS The SDM-Q-9 showed face validity and high acceptance. Factor analysis revealed a clearly one-dimensional nature of the underlying construct. Both item difficulties and discrimination indices proved to be appropriate.
Internal consistency yielded a Cronbach's alpha of 0.938 in the test sample. CONCLUSION The SDM-Q-9 is a reliable and well accepted instrument. Generalizability of the findings is limited by the elderly sample living in rural areas of Germany. While the current results are promising, further testing of criterion validity and administration in other populations is necessary. PRACTICE IMPLICATIONS The SDM-Q-9 can be used in studies investigating the effectiveness of interventions aimed at the implementation of SDM and as a quality indicator in health services assessment.
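The internal-consistency statistic reported for the SDM-Q-9 above (Cronbach's alpha of 0.938) is computed from the item variances and the variance of the total score. A minimal sketch of the standard formula, with invented item data (not the SDM-Q-9 sample):

```python
def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance(total)).

    items: list of equal-length lists, one per questionnaire item.
    """
    k = len(items)
    n = len(items[0])

    def var(xs):
        # population variance, as in the classic formula
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var_sum = sum(var(col) for col in items)
    # total score per respondent
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - item_var_sum / var(totals))

# Hypothetical 3-item, 4-respondent example (illustration only)
items = [[3, 4, 4, 5], [2, 4, 3, 5], [3, 5, 4, 5]]
print(round(cronbach_alpha(items), 3))
```

Values near 0.9, as reported for the SDM-Q-9, indicate that the items covary strongly enough to be summed into a single scale score.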
11,040
30,419,185
Results Of 13 reviews and six single studies assessing efficacy, consistent evidence of benefit was seen only with apps for diabetes, as measured by decreased glycosylated haemoglobin levels (HbA1c). Some, but not all, studies showed benefit in asthma, low back pain, alcohol addiction, heart failure, ischaemic heart disease and cancer. There was no evidence of benefit in COPD, cognitive impairment or CKD. In all studies, benefits were clinically marginal and none related to morbid events or hospitalisation. Conclusion Evidence of clinical benefit of most available apps is very limited. Design features that enhance usability and maximise efficacy were identified. This paper presents a comprehensive review of evidence relating to the efficacy, usability and evaluation of apps for 11 common diseases aimed at assisting patients in self-management. Consistent evidence of benefit was only seen for diabetes apps; there was absent or conflicting evidence of benefit for apps for the remaining 10 diseases. Benefits that were detected were of marginal clinical importance, with no reporting of hard clinical end-points, such as mortality or hospitalisations. Many apps lacked design features that the literature identified as enhancing usability and potential to confer benefit.
Objective Smartphone health applications (apps) are being increasingly used to assist patients in chronic disease self-management. The effects of such apps on patient outcomes are uncertain, as are design features that maximise usability and efficacy, and the best methods for evaluating app quality and utility. Smartphone health apps have attracted considerable interest from patients and health managers as a means of promoting more effective self-management of chronic diseases, which leads to better health outcomes. However, most commercially available apps have never been evaluated for benefits or harms in clinical trials, and there are currently no agreed quality criteria, standards or regulations to ensure health apps are user-friendly, accurate in content, evidence based or efficacious. What does this paper add? The number of smartphone apps will continue to grow, as will the appetite for patients and clinicians to use them in chronic disease self-management.
Background The use of mobile apps for health and wellbeing promotion has grown exponentially in recent years. Yet, there is currently no app-quality assessment tool beyond “star”-ratings. Objective The objective of this study was to develop a reliable, multidimensional measure for trialling, classifying, and rating the quality of mobile health apps. Methods A literature search was conducted to identify articles containing explicit Web or app quality rating criteria published between January 2000 and January 2013. Existing criteria for the assessment of app quality were categorized by an expert panel to develop the new Mobile App Rating Scale (MARS) subscales, items, descriptors, and anchors. Sixty wellbeing apps were randomly selected using an iTunes search for MARS rating; ten were used to pilot the rating procedure, and the remaining 50 provided data on interrater reliability. Results There were 372 explicit criteria for assessing Web or app quality extracted from 25 published papers, conference proceedings, and Internet resources. Five broad categories of criteria were identified, including four objective quality scales - engagement, functionality, aesthetics, and information quality - and one subjective quality scale, which were refined into the 23-item MARS. The MARS demonstrated excellent internal consistency (alpha = .90) and interrater reliability intraclass correlation coefficient (ICC = .79). Conclusions The MARS is a simple, objective, and reliable tool for classifying and assessing the quality of mobile health apps. It can also be used to provide a checklist for the design and development of new high quality health apps. Purpose Patients undergoing radiotherapy for prostate cancer suffer from a variety of symptoms which influence health-related quality of life.
We have developed an application (Interaktor) for smartphones and tablets for early detection, reporting and management of symptoms and concerns during treatment for prostate cancer. The study evaluates the effect on symptom burden and quality of life when using the application for real-time symptom assessment and management during radiotherapy for localized prostate cancer. Methods A non-randomized controlled study was used at two university hospitals in Sweden, where 64 patients constituted a control group and 66 patients made up an intervention group. The intervention group was asked to report symptoms via the application daily during the treatment as well as 3 weeks after. The EORTC QLQ-C30, its module PR25 and the Sense of Coherence questionnaire were administered at three time points in both groups. Results The intervention group rated significantly lower levels of fatigue and nausea at the end of radiotherapy. Moreover, they had significantly less burden in emotional functioning, insomnia, and urinary-related symptoms at the end of treatment as well as 3 months later compared with the control group. In the multivariate analyses, with education and sense of coherence as covariates, the intervention group still rated emotional functioning (p = 0.007), insomnia (p = 0.017), and urinary-related symptoms (p = 0.008) significantly better than the control group at T2. Conclusion Study findings suggest that Interaktor could be an efficient mHealth tool for facilitating supportive care needs during cancer treatment. Background Mobile health apps for diabetes self-management have different functions. However, the efficacy and safety of each function are not well studied, and no classification is available for these functions.
Objective The aims of this study were to (1) develop and validate a taxonomy of apps for diabetes self-management, (2) investigate the glycemic efficacy of mobile app-based interventions among adults with diabetes in a systematic review of randomized controlled trials (RCTs), and (3) explore the contribution of different functions to the effectiveness of entire app-based interventions using the taxonomy. Methods We developed a 3-axis taxonomy with columns of clinical modules, rows of functional modules and cells of functions with risk assessments. This taxonomy was validated by reviewing and classifying commercially available diabetes apps. We searched MEDLINE, EMBASE, the Cochrane Central Register of Controlled Trials, the Chinese Biomedical Literature Database, and ClinicalTrials.gov from January 2007 to May 2016. We included RCTs of adult outpatients with diabetes that compared using mobile app-based interventions with usual care alone. The mean differences (MDs) in hemoglobin A1c (HbA1c) concentrations and risk ratios of adverse events were pooled using a random-effects meta-analysis. After taxonomic classification, we performed exploratory subgroup analyses of the presence or absence of each module across the included app-based interventions. Results Across 12 included trials involving 974 participants, using app-based interventions was associated with a clinically significant reduction of HbA1c (MD 0.48%, 95% CI 0.19%-0.78%) without excess adverse events. Larger HbA1c reductions were noted among patients with type 2 diabetes than those with type 1 diabetes (MD 0.67%, 95% CI 0.30%-1.03% vs MD 0.37%, 95% CI –0.12%-0.86%).
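The pooled HbA1c mean differences above come from a random-effects meta-analysis. A common estimator for the between-study variance is DerSimonian-Laird; the sketch below uses that method with invented per-study values (the review's actual study-level data are not reproduced here):

```python
import math

def pool_random_effects(mds, ses):
    """DerSimonian-Laird random-effects pooled mean difference.

    mds: per-study mean differences; ses: their standard errors.
    Returns (pooled_md, ci_low, ci_high) with a 95% Wald interval.
    """
    w = [1 / s ** 2 for s in ses]                 # inverse-variance weights
    md_fe = sum(wi * m for wi, m in zip(w, mds)) / sum(w)
    q = sum(wi * (m - md_fe) ** 2 for wi, m in zip(w, mds))  # Cochran's Q
    df = len(mds) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_re = [1 / (s ** 2 + tau2) for s in ses]     # random-effects weights
    pooled = sum(wi * m for wi, m in zip(w_re, mds)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

# Hypothetical HbA1c mean differences (%) from three small trials
md, lo, hi = pool_random_effects([0.2, 0.9, 0.5], [0.15, 0.2, 0.25])
print(round(md, 2), round(lo, 2), round(hi, 2))
```

When Q does not exceed its degrees of freedom, tau-squared is truncated at zero and the estimate collapses to the fixed-effect (inverse-variance) result.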
Having a complication prevention module in app-based interventions was associated with a greater HbA1c reduction (with complication prevention: MD 1.31%, 95% CI 0.66%-1.96% vs without: MD 0.38%, 95% CI 0.09%-0.67%; intersubgroup P=.01), as was having a structured display (with structured display: MD 0.69%, 95% CI 0.32%-1.06% vs without: MD 0.69%, 95% CI –0.18%-0.53%; intersubgroup P=.03). However, having a clinical decision-making function was not associated with a larger HbA1c reduction (with clinical decision making: MD 0.19%, 95% CI –0.24%-0.63% vs without: MD 0.61%, 95% CI 0.27%-0.95%; intersubgroup P=.14). Conclusions The use of mobile app-based interventions yields a clinically significant HbA1c reduction among adult outpatients with diabetes, especially among those with type 2 diabetes. Our study suggests that the clinical decision-making function needs further improvement and evaluation before being added to apps. Background Brief interventions via the internet have been shown to reduce university students' alcohol intake. This study tested two smartphone applications (apps) targeting drinking choices on party occasions, with the goal of reducing problematic alcohol intake among Swedish university students. Methods Students were recruited via e-mails sent to student union members at two universities. Those who gave informed consent, had a smartphone, and showed risky alcohol consumption according to the Alcohol Use Disorders Identification Test (AUDIT) were randomized into three groups. Group 1 had access to the Swedish government alcohol monopoly's app, Promillekoll, offering real-time estimated blood alcohol concentration (eBAC) calculation; Group 2 had access to a web-based app, PartyPlanner, developed by the research group, offering real-time eBAC calculation with planning and follow-up functions; and Group 3 participants were controls. Follow-up was conducted at 7 weeks.
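Both Promillekoll and PartyPlanner offered real-time eBAC calculation. The abstract does not give their formulas; a common basis for such estimates is the classic Widmark equation, sketched here with illustrative defaults (the distribution factor, elimination rate, and grams of alcohol per standard drink are assumptions, not the apps' actual parameters):

```python
def widmark_ebac(std_drinks, weight_kg, hours, male=True,
                 grams_per_drink=12.0):
    """Estimated blood alcohol concentration (g/100 mL, i.e. %) via the
    classic Widmark formula: A / (r * body mass) minus elimination over time.
    Parameter values are illustrative defaults only."""
    r = 0.68 if male else 0.55     # Widmark distribution factor (assumed)
    beta = 0.015                   # elimination rate, % per hour (assumed)
    alcohol_g = std_drinks * grams_per_drink
    ebac = alcohol_g / (weight_kg * 1000 * r) * 100 - beta * hours
    return max(0.0, ebac)          # eBAC cannot go below zero

# e.g. 4 standard drinks, 80 kg male, 2 hours after starting
print(round(widmark_ebac(4, 80, 2), 3))
```

Real-time apps like those in the trial would recompute this as each drink is logged, which is why the abstract stresses "real-time" eBAC feedback.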
Results Among 28574 students offered participation, 4823 agreed to join; 415 were excluded due to incomplete data, and 1932 fulfilled eligibility criteria for randomization. Attrition was 22.7–39.3 percent, higher among heavier drinkers and highest in Group 2. Self-reported app use was higher in Group 1 (74%) compared to Group 2 (41%). Per-protocol analyses revealed only one significant time-by-group interaction, where Group 1 participants increased the frequency of their drinking occasions compared to controls (p = 0.001). Secondary analyses by gender showed a significant difference among men in Group 1 for frequency of drinking occasions per week (p = 0.001), but not among women. Among all participants, 29 percent showed high-risk drinking, over the recommended weekly drinking levels of 9 (women) and 14 (men) standard glasses. Conclusions Smartphone apps can make brief interventions available to large numbers of university students. The apps studied using eBAC calculation did not, however, seem to affect alcohol consumption among university students, and one app may have led to a negative effect among men. Future research should: 1) explore ways to increase user retention, 2) include apps facilitating technical manipulation for evaluation of added components, 3) explore the effects of adapting app content to possible gender differences, and 4) offer additional interventions to high-risk users. Trial registration clinicaltrials.gov: NCT01958398. BACKGROUND Nonpharmacological intervention for individuals with mild cognitive impairment (MCI) needs further investigation. OBJECTIVE Test efficacy of an eight-week Chinese calligraphy writing training course in improving attentional control and working memory. METHODS Ninety-nine participants with MCI were randomized into the eight-week calligraphy writing (n = 48) or control (tablet computer) training (n = 51).
Outcomes of the interventions were attentional control, working memory, visual scan and processing speed. They were measured at baseline, post-training, and six-month follow-up. RESULTS Calligraphy writing, when compared with control, significantly improved working memory, as reflected from DST-Backward sequence (p = 0.009) and span scores (p = 0.002), and divided attention, as reflected from CTT2 (p < 0.001), at post-training. The unique improvement in working memory (span: p < 0.001; sequence: p = 0.008) of the intervention group was also found at follow-up when compared with baseline. Changes in the other outcome measures were not statistically significant. CONCLUSION The findings provide support that Chinese calligraphy writing training for eight weeks using a cognitive approach would improve working memory and, to a lesser extent, attentional control functions of patients with early MCI. They also demonstrate the usefulness of using mind-and-body practice for improving specific cognitive functions. BACKGROUND AND OBJECTIVES Patient self-management has been shown to improve health outcomes. We developed a smartphone-based system to boost self-care by patients with CKD and integrated its use into usual CKD care. We determined its acceptability and examined changes in several clinical parameters. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS We recruited patients with stage 4 or 5 CKD attending outpatient renal clinics who responded to a general information newsletter about this 6-month proof-of-principle study. The smartphone application targeted four behavioral elements: monitoring BP, medication management, symptom assessment, and tracking laboratory results. Prebuilt customizable algorithms provided real-time personalized patient feedback and alerts to providers when predefined treatment thresholds were crossed or critical changes occurred.
Those who died or started RRT within the first 2 months were replaced. Only participants followed for 6 months after recruitment were included in assessing changes in clinical measures. RESULTS In total, 47 patients (26 men; mean age = 59 years old; 33% were ≥65 years old) were enrolled; 60% had never used a smartphone. User adherence was high (>80% performed ≥80% of recommended assessments) and sustained. The mean reductions in home BP readings between baseline and exit were statistically significant (systolic BP, -3.4 mmHg; 95% confidence interval, -5.0 to -1.8 and diastolic BP, -2.1 mmHg; 95% confidence interval, -2.9 to -1.2); 27% with normal clinic BP readings had newly identified masked hypertension. One hundred twenty-seven medication discrepancies were identified; 59% were medication errors that required an intervention to prevent harm. In exit interviews, patients indicated feeling more confident and in control of their condition; clinicians perceived patients to be better informed and more engaged. CONCLUSIONS Integrating a smartphone-based self-management system into usual care of patients with advanced CKD proved feasible and acceptable, and it appeared to be clinically useful. The results provide a strong rationale for a randomized, controlled trial. We performed a pilot randomized, controlled trial of intensive, computer-based cognitive training in 47 subjects with mild cognitive impairment. The intervention group performed exercises specifically designed to improve auditory processing speed and accuracy for 100 min/d, 5 d/wk for 6 weeks; the control group performed more passive computer activities (reading, listening, visuospatial game) for similar amounts of time. Subjects had a mean age of 74 years and 60% were men; 77% successfully completed training.
On our primary outcome, Repeatable Battery for Assessment of Neuropsychological Status total scores improved 0.36 standard deviations (SD) in the intervention group (P=0.097) compared with 0.03 SD in the control group (P=0.88), for a nonsignificant difference between the groups of 0.33 SD (P=0.26). On 12 secondary outcome measures, most differences between the groups were not statistically significant. However, we observed a pattern in which effect sizes for verbal learning and memory measures tended to favor the intervention group, whereas effect sizes for language and visuospatial function measures tended to favor the control group, which raises the possibility that these training programs may have domain-specific effects. We conclude that intensive, computer-based mental activity is feasible in subjects with mild cognitive impairment and that larger trials are warranted. Objective To determine whether mobile phone based monitoring improves asthma control compared with standard paper based monitoring strategies. Design Multicentre randomised controlled trial with cost effectiveness analysis. Setting UK primary care. Participants 288 adolescents and adults with poorly controlled asthma (asthma control questionnaire (ACQ) score ≥1.5) from 32 practices. Intervention Participants were centrally randomised to twice daily recording and mobile phone based transmission of symptoms, drug use, and peak flow with immediate feedback prompting action according to an agreed plan, or paper based monitoring. Main outcome measures Changes in scores on asthma control questionnaire and self efficacy (knowledge, attitude, and self efficacy asthma questionnaire (KASE-AQ)) at six months after randomisation. Assessment of outcomes was blinded. Analysis was on an intention to treat basis.
Results There was no significant difference in the change in asthma control or self efficacy between the two groups (ACQ: mean change 0.75 in mobile group v 0.73 in paper group, mean difference in change −0.02 (95% confidence interval −0.23 to 0.19); KASE-AQ score: mean change −4.4 v −2.4, mean difference 2.0 (−0.3 to 4.2)). The numbers of patients who had acute exacerbations, steroid courses, and unscheduled consultations were similar in both groups, with similar healthcare costs. Overall, the mobile phone service was more expensive because of the expenses of telemonitoring. Conclusions Mobile technology does not improve asthma control or increase self efficacy compared with paper based monitoring when both groups received clinical care to guidelines standards. The mobile technology was not cost effective. Trial registration ClinicalTrials.gov NCT00512837. Objectives Evaluating and comparing the effectiveness of two smartphone-delivered treatments: one based on behavioural activation (BA) and the other on mindfulness. Design Parallel randomised controlled, open, trial. Participants were allocated using an online randomisation tool, handled by an independent person who was separate from the staff conducting the study. Setting General community, with recruitment nationally through mass media and advertisements. Participants 40 participants diagnosed with major depressive disorder received a BA treatment, and 41 participants received a mindfulness treatment. 9 participants were lost at the post-treatment. Intervention BA: an 8-week long behaviour programme administered via a smartphone application. Mindfulness: an 8-week long mindfulness programme, administered via a smartphone application. Main outcome measures The Beck Depression Inventory-II (BDI-II) and the nine-item Patient Health Questionnaire Depression Scale (PHQ-9). Results 81 participants were randomised (mean age 36.0 years (SD=10.8)) and analysed.
Results showed no significant interaction effects of group and time on any of the outcome measures, either from pretreatment to post-treatment or from pretreatment to the 6-month follow-up. Subgroup analyses showed that the BA treatment was more effective than the mindfulness treatment among participants with higher initial severity of depression from pretreatment to the 6-month follow-up (PHQ-9: F(1, 362.1)=5.2, p<0.05). In contrast, the mindfulness treatment worked better than the BA treatment among participants with lower initial severity from pretreatment to the 6-month follow-up (PHQ-9: F(1, 69.3)=7.7, p<0.01; BDI-II: F(1, 53.60)=6.25, p<0.05). Conclusions The two interventions did not differ significantly from one another. For participants with higher severity of depression, the treatment based on BA was superior to the treatment based on mindfulness. For participants with lower initial severity, the treatment based on mindfulness worked significantly better than the treatment based on BA. Trial registration ClinicalTrials.gov NCT01463020. Objective Cardiac rehabilitation (CR) is pivotal in preventing recurring events of myocardial infarction (MI). This study aims to investigate the effect of a smartphone-based home service delivery (Care Assessment Platform) of CR (CAP-CR) on CR use and health outcomes compared with a traditional, centre-based programme (TCR) in post-MI patients. Methods In this unblinded randomised controlled trial, post-MI patients were randomised to TCR (n=60; 55.7±10.4 years) and CAP-CR (n=60; 55.5±9.6 years) for a 6-week CR and 6-month self-maintenance period. CAP-CR, delivered in participants' homes, included health and exercise monitoring, motivational and educational material delivery, and weekly mentoring consultations. CAP-CR uptake, adherence and completion rates were compared with TCR using intention-to-treat analyses.
Changes in clinical outcomes (modifiable lifestyle factors, biomedical risk factors and health-related quality of life) across baseline, 6 weeks and 6 months were compared within, and between, groups using linear mixed model regression. Results CAP-CR had significantly higher uptake (80% vs 62%), adherence (94% vs 68%) and completion (80% vs 47%) rates than TCR (p<0.05). Both groups showed significant improvements in 6-minute walk test from baseline to 6 weeks (TCR: 537±86–584±99 m; CAP-CR: 510±77–570±80 m), which was maintained at 6 months. CAP-CR showed slight weight reduction (89±20–88±21 kg) and also demonstrated significant improvements in emotional state (K10: median (IQR) 14.6 (13.4–16.0) to 12.6 (11.5–13.8)) and quality of life (EQ5D-Index: median (IQR) 0.84 (0.8–0.9) to 0.92 (0.9–1.0)) at 6 weeks. Conclusions This smartphone-based home care CR programme improved post-MI CR uptake, adherence and completion. The home-based CR programme was as effective in improving physiological and psychological health outcomes as traditional CR. CAP-CR is a viable option towards optimising use of CR services. Trial registration number ANZCTR12609000251224. Background Internet-based interventions are increasingly used to support self-management of individuals with chronic illnesses. Web-based interventions may also be effective in enhancing self-management for individuals with chronic pain, but little is known about long-term effects. Research on Web-based interventions to support self-management following participation in pain management programs is limited. Objective The aim is to examine the long-term effects of a 4-week smartphone-intervention with diaries and therapist-written feedback following an inpatient chronic pain rehabilitation program, previously found to be effective at short-term and 5-month follow-ups.
Methods 140 women with chronic widespread pain, participating in a 4-week inpatient rehabilitation program, were randomized into two groups: with or without a smartphone intervention after the rehabilitation. The smartphone intervention consisted of one face-to-face individual session and 4 weeks of written communication via a smartphone, consisting of three diaries daily to elicit pain-related thoughts, feelings, and activities, as well as daily personalized written feedback based on cognitive behavioral principles from a therapist. Both groups were given access to an informational website to promote constructive self-management. Outcomes were measured with self-reported paper-and-pencil format questionnaires, with catastrophizing as the primary outcome measure. Secondary outcomes included daily functioning and symptom levels, acceptance of pain, and emotional distress. Results By the 11-month follow-up, the favorable between-group differences previously reported post-intervention and at 5-month follow-up on catastrophizing, acceptance, functioning, and symptom level were no longer evident (P>.10). However, there was more improvement in catastrophizing scores during the follow-up period in the intervention group (M=-2.36, SD 8.41) compared to the control group (M=.40, SD 7.20), P=.045. Also, per protocol within-group analysis showed a small positive effect (Cohen's d=.33) on catastrophizing in the intervention group (P=.04) and no change in the control group from the smartphone intervention baseline to 11-month follow-up. A positive effect (Cohen's d=.73) on acceptance was found within the intervention group (P<.001) but not in the control group. Small to large negative effects were found within the control group on functioning and symptom levels, emotional distress, and fatigue (P=.05) from the intervention baseline to the 11-month follow-up.
Conclusion The long-term results of this randomized trial are ambiguous. No significant between-group effect was found on the study variables at 11-month follow-up. However, the within-group analyses, comparing the baseline for the smartphone intervention to the 11-month data, indicated changes in the desired direction in catastrophizing and acceptance in the intervention group but not within the control group. This study provides modest evidence supporting the long-term effect of the intervention. Trial Registration ClinicalTrials.gov NCT01236209; http://www.clinicaltrials.gov/ct2/show/NCT01236209 (Archived by WebCite at http://www.webcitation.org/6FF7KUXo0). Purpose To investigate and compare the effects of mobile health (mHealth) and pedometer with conventional exercise program using a brochure on physical function and quality of life (QOL). Methods The study was a prospective, quasi-randomized multicenter trial in which 356 patients whose cancer treatment had been terminated were enrolled. All patients were instructed to perform a 12-week regimen of aerobic and resistance exercise. The mHealth group received a pedometer and a newly developed smartphone application to provide information and monitor the prescribed exercises. Those in the conventional group received an exercise brochure. Physical measurements were conducted at baseline, 6 weeks, and 12 weeks. Self-reported physical activity (international physical activity questionnaire-short form), general QOL (European Organization for Research and Treatment of Cancer Quality of Life Questionnaire Core 30), and breast cancer-specific QOL (Quality of Life Questionnaire Breast Cancer Module 23) were assessed at baseline and 12 weeks. A user satisfaction survey was assessed in the mHealth group. Results Basic characteristics were not different between the two groups except for age and previous radiotherapy.
Physical function, physical activity, and QOL scores were significantly improved regardless of the intervention method, and changes were not significantly different between the two groups. Additionally, the mean Likert scale response for overall satisfaction with the service was 4.27/5 in the mHealth group. Conclusions Overall, both mHealth coupled with a pedometer and conventional exercise education using a brochure were effective in improving physical function, physical activity, and QOL. This study provides a basis for mHealth research in breast cancer patients in this progressively developing field, although superiority of the mHealth program over the conventional program was not definitely evident. Nearly all college student smokers also drink alcohol, and smoking and heavy episodic drinking (HED) commonly co-occur. However, few studies have examined the factors that concurrently influence smoking and HED among college students and, to date, no interventions have been developed that target both HED and smoking in this population. The objective of the current study was to develop and evaluate a mobile feedback intervention that targets HED and smoking. Participants (N = 94) were non-treatment-seeking college students (M(age) = 20.5 years, SD = 1.7) who engaged in at least a single HED episode in the past 2 weeks and reported concurrent smoking and drinking at least once a week. Participants were randomized to receive either the mobile intervention for 14 days, complete mobile assessments (without intervention) for 14 days, or complete minimal assessments (without intervention or mobile assessments). At a 1-month follow-up, compared with the minimal assessment condition, we observed significant reductions in the number of cigarettes per smoking day in both the mobile intervention (d = 0.55) and mobile assessment (d = 0.45) conditions.
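The d = 0.55 and d = 0.45 figures above are standardized effect sizes (Cohen's d). With the usual pooled-standard-deviation denominator, the computation looks like this (the sample values are invented for illustration, not the trial's data):

```python
import math

def cohens_d(group_a, group_b):
    """Cohen's d: difference in means divided by the pooled sample SD."""
    na, nb = len(group_a), len(group_b)
    ma = sum(group_a) / na
    mb = sum(group_b) / nb
    # unbiased (n-1) sample variances
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    sd_pooled = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / sd_pooled

# hypothetical per-participant reductions in cigarettes per smoking day
print(round(cohens_d([5, 7, 6, 8, 9], [4, 5, 3, 6, 4]), 2))
```

By the usual rule of thumb, d around 0.5 (as in both conditions above) is a medium-sized effect.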
Among those randomized to the mobile intervention, receiving more modules of the intervention was significantly associated with a lower likelihood of any drinking during the 14-day assessment period and significant reductions in smoking at 1-month follow-up. The mobile intervention did not result in significant reductions in HED or concurrent smoking and drinking. Future research should continue to examine ways of using technology and the real-time environment to improve interventions for HED and smoking. Background Heart failure (HF) patients suffer from frequent and repeated hospitalizations, causing a substantial economic burden on society. Hospitalizations can be reduced considerably by better compliance with self-care. Home telemonitoring has the potential to boost patients' compliance with self-care, although the results are still contradictory. Objective A randomized controlled trial was conducted in order to study whether the multidisciplinary care of heart failure patients promoted with telemonitoring leads to decreased HF-related hospitalization. Methods HF patients were eligible whose left ventricular ejection fraction was lower than 35%, NYHA functional class ≥2, and who needed regular follow-up. Patients in the telemonitoring group (n=47) measured their body weight, blood pressure, and pulse and answered symptom-related questions on a weekly basis, reporting their values to the heart failure nurse using a mobile phone app. The heart failure nurse followed the status of patients weekly and if necessary contacted the patient. The primary outcome was the number of HF-related hospital days. Control patients (n=47) received multidisciplinary treatment according to standard practices. Patients' clinical status, use of health care resources, adherence, and user experience from the patients' and the health care professionals' perspective were studied.
Results Adherence, calculated as a proportion of weekly submitted self-measurements, was close to 90%. No difference was found in the number of HF-related hospital days (incidence rate ratio [IRR]=0.812, P=.351), which was the primary outcome. The intervention group used more health care resources: they paid an increased number of visits to the nurse (IRR=1.73, P<.001), spent more time at the nurse reception (mean difference of 48.7 minutes, P<.001), and there was a greater number of telephone contacts between the nurse and intervention patients (IRR=3.82, P<.001 for nurse-induced contacts and IRR=1.63, P=.049 for patient-induced contacts). There were no statistically significant differences in patients' clinical health status or in their self-care behavior. The technology received excellent feedback from the patient and professional side, with a high adherence rate throughout the study. Conclusions Home telemonitoring did not reduce the number of patients' HF-related hospital days and did not improve the patients' clinical condition. Patients in the telemonitoring group contacted the Cardiology Outpatient Clinic more frequently, and in this way increased the use of health care resources. Trial Registration ClinicalTrials.gov NCT01759368; http://clinicaltrials.gov/show/NCT01759368 (Archived by WebCite at http://www.webcitation.org/6UFxiCk8Z). Background This paper reports the results of a pilot randomized controlled trial comparing the delivery modality (mobile phone/tablet or fixed computer) of a cognitive behavioural therapy intervention for the treatment of depression. The aim was to establish whether a previously validated computerized program (The Sadness Program) remained efficacious when delivered via a mobile application.
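The telemonitoring trial above summarizes its count outcomes as incidence rate ratios (e.g. IRR = 0.812 for HF-related hospital days). A basic IRR with a Wald confidence interval on the log scale can be sketched as follows; the event counts and person-times are invented, and treating hospital days as independent Poisson events is a simplification of what trial statisticians would actually fit:

```python
import math

def irr_with_ci(events_a, time_a, events_b, time_b):
    """Incidence rate ratio (group A vs B) with a 95% Wald CI computed
    on the log scale. Inputs here are illustrative, not the trial's data."""
    rate_a = events_a / time_a
    rate_b = events_b / time_b
    irr = rate_a / rate_b
    # approximate SE of log(IRR) under independent Poisson counts
    se_log = math.sqrt(1 / events_a + 1 / events_b)
    lo = math.exp(math.log(irr) - 1.96 * se_log)
    hi = math.exp(math.log(irr) + 1.96 * se_log)
    return irr, lo, hi

# e.g. 30 hospital days over 40 patient-years vs 37 over 40
irr, lo, hi = irr_with_ci(30, 40, 37, 40)
print(round(irr, 3), round(lo, 2), round(hi, 2))
```

An IRR below 1 favours the intervention; as in the trial's primary result, a CI that spans 1 means the rate difference is not statistically significant.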
Method 35 participants were recruited with Major Depression (80% female) and randomly allocated to access the program using a mobile app (on either a mobile phone or iPad) or a computer. Participants completed 6 lessons, weekly homework assignments, and received weekly email contact from a clinical psychologist or psychiatrist until completion of lesson 2. After lesson 2, email contact was only provided in response to participant request, or in response to a deterioration in psychological distress scores. The primary outcome measure was the Patient Health Questionnaire 9 (PHQ-9). Of the 35 participants recruited, 68.6% completed 6 lessons and 65.7% completed the 3-month follow-up. Attrition was handled using mixed-model repeated-measures ANOVA. Results Both the Mobile and Computer Groups were associated with statistically significant benefits in the PHQ-9 at post-test. At 3-month follow-up, the reduction seen for both groups remained significant. Conclusions These results provide evidence to indicate that delivering a CBT program using a mobile application can result in clinically significant improvements in outcomes for patients with depression. Trial registration Australian New Zealand Clinical Trials Registry ACTRN Commercial mobile apps for health behavior change are flourishing in the marketplace, but little evidence exists to support their use. This paper summarizes methods for evaluating the content, usability, and efficacy of commercially available health apps. Content analyses can be used to compare app features with clinical guidelines, evidence-based protocols, and behavior change techniques. Usability testing can establish how well an app functions and serves its intended purpose for a target population. Observational studies can explore the association between use and clinical and behavioral outcomes.
Finally, efficacy testing can establish whether a commercial app impacts an outcome of interest via a variety of study designs, including randomized trials, multiphase optimization studies, and N-of-1 studies. Evidence in all these forms would increase adoption of commercial apps in clinical practice, inform the development of the next generation of apps, and ultimately increase the impact of commercial apps. The self-management of asthma can improve clinical outcomes. Recently, mobile telephones have been widely used as an efficient, instant personal communication tool. This study investigated whether a self-care system will achieve better asthma control through a mobile telephone-based interactive programme. This was a prospective, controlled study in outpatient clinics. From 120 consecutive patients with moderate-to-severe persistent asthma, 89 were eventually recruited for the study, with 43 in the mobile telephone group (with a mobile telephone-based interactive asthma self-care system). In the mobile telephone group, mean±SEM peak expiratory flow rate significantly increased at 4 (378.2±9.3 L·min−1; n=43; p=0.020), 5 (378.2±9.2 L·min−1; n=43; p=0.008) and 6 months (382.7±8.6 L·min−1; n=43; p=0.001) compared to the control group. Mean±SEM forced expiratory volume in 1 s significantly increased at 6 months (65.2±3.2% predicted; n=43; p<0.05). Patients in the mobile telephone group had better quality of life after 3 months, as determined using the Short Form-12® physical component score, and fewer episodes of exacerbation and unscheduled visits than the control group. Patients in the mobile telephone group significantly increased their mean daily dose of either systemic or inhaled corticosteroids compared with the control group.
The mobile telephone-based interactive self-care system provides convenient and practical self-monitoring and management of asthma, and improves asthma control. Abstract Purpose: This research aimed to integrate three previously developed assistive technology (AT) systems into one modular, multifunctional system, which can support people with dementia and carers throughout the course of dementia. In an explorative evaluation study, the integrated system, called Rosetta, was tested on usefulness, user-friendliness and impact, in people with dementia, their informal carers and professional carers involved. The Rosetta system was installed in participants' homes in three countries: the Netherlands, Germany and Belgium. Methods: Controlled trial with pre- and post-test measures across three countries (randomized controlled trial in Germany; matched groups in the Netherlands and Belgium). Participants completed questionnaires for impact measurement and participated in semi-structured interviews regarding usefulness and user-friendliness of Rosetta. Results: All participants agreed that Rosetta is a very useful development. They did not rate the user-friendliness of the system highly. No significant effects were found on impact measurements. Conclusion: All participants found Rosetta a very useful development for future care, and would consider using it. Since Rosetta was still in development during the evaluation, a discrepancy existed between expectations and the actual functioning of Rosetta, which may explain the lack of findings on the impact of the system and the low appreciation of user-friendliness. Implications for Rehabilitation People with dementia and carers find assistive technology (AT) a useful future development and they are willing to use it in the future. People with dementia and carers have few privacy issues with AT.
If they have concerns, they are willing to accept the trade-off of reduced privacy in exchange for the ability to live in their own homes for longer. Given that a system works flawlessly, informal carers indicate that integrated AT can reduce their burden and stress. This can in turn help informal carers to provide better care for a longer period of time. BACKGROUND Patients with myocardial infarction (MI) seldom reach recommended targets for secondary prevention. This study evaluated a smartphone application ("app") aimed at improving treatment adherence and cardiovascular lifestyle in MI patients. DESIGN Multicenter, randomized trial. METHODS A total of 174 ticagrelor-treated MI patients were randomized to either an interactive patient support tool (active group) or a simplified tool (control group) in addition to usual post-MI care. The primary end point was a composite nonadherence score measuring patient-registered ticagrelor adherence, defined as a combination of adherence failure events (2 missed doses registered in 7-day cycles) and treatment gaps (4 consecutive missed doses). Secondary end points included change in cardiovascular risk factors, quality of life (European Quality of Life-5 Dimensions), and patient device satisfaction (System Usability Scale). RESULTS Patient mean age was 58 years, 81% were men, and 21% were current smokers. At 6 months, greater patient-registered drug adherence was achieved in the active vs the control group (nonadherence score: 16.6 vs 22.8 [P=.025]). Numerically, the active group was associated with a higher degree of smoking cessation, increased physical activity, and change in quality of life; however, this did not reach statistical significance. Patient satisfaction was significantly higher in the active vs the control group (system usability score: 87.3 vs 78.1 [P=.001]).
CONCLUSIONS In MI patients, use of an interactive patient support tool improved patient self-reported drug adherence and may be associated with a trend toward improved cardiovascular lifestyle changes and quality of life. Use of a disease-specific interactive patient support tool may be an appreciated, simple, and promising complement to standard secondary prevention. In this article two new methods for building and evaluating eHealth interventions are described. The first is the Multiphase Optimization Strategy (MOST). It consists of a screening phase, in which intervention components are efficiently identified for inclusion in an intervention or for rejection, based on their performance; a refining phase, in which the selected components are fine-tuned and issues such as optimal levels of each component are investigated; and a confirming phase, in which the optimized intervention, consisting of the selected components delivered at optimal levels, is evaluated in a standard randomized controlled trial. The second is the Sequential Multiple Assignment Randomized Trial (SMART), which is an innovative research design especially suited for building time-varying adaptive interventions. A SMART trial can be used to identify the best tailoring variables and decision rules for an adaptive intervention empirically. Both the MOST and SMART approaches use randomized experimentation to enable valid inferences. When properly implemented, these approaches will lead to the development of more potent eHealth interventions. Abstract Objectives. To evaluate whether a new home intervention system (HIS, OPTILOGG®), consisting of specialised software and a tablet computer (tablet) wirelessly connected to a weight scale, may improve self-care behaviour, health-related quality of life (HRQoL) and knowledge about heart failure (HF), and reduce hospital days due to HF. Design.
82 patients (32% female), mean age 75±8 years, hospitalised with HF were randomised at discharge to an intervention group (IG) equipped with the HIS or to a control group (CG) receiving standard HF information only. The tablet contained information about HF and lifestyle advice according to current guidelines. It also showed the present dose of diuretic, and changes in patient-measured weight and HRQoL over time. Results. After 3 months the IG displayed a dramatic improvement in self-care, with p<0.05 (median IG: 17 [IQR: 13, 22] and CG: 21 [IQR: 17, 25]). Disease-specific HRQoL was measured by the Kansas City Cardiomyopathy Questionnaire. The IG had a significantly higher score (median IG: 65.1 [IQR: 38.5, 83.3] vs. CG: 52.1 [IQR: 41.1, 64.1], p<0.05) and improved physical limitation (median IG: 54.2 [IQR: 37.7, 83.3] vs. CG: 45.8 [IQR: 25.0, 54.2], p<0.05). There was no difference in knowledge. The IG showed fewer HF-related days in hospital, with 1.3 HF-related hospital days/patient versus 3.5 in the CG (risk ratio: 0.38; 95% confidence interval: 0.31–0.46; p<0.05). Conclusion. HF patients with a HIS tablet computer and scale improved in self-care and HRQoL. Days in hospital due to HF were reduced. A medical device that is easy to use can be a valuable tool for improving self-care and outcomes in patients with HF. OBJECTIVE To evaluate smartphone apps intended for self-management of pain using quality assessment criteria and usability testing with prospective users. DESIGN 1) Survey and content analysis of available apps; and 2) individual usability study of two apps. SETTING University of Leeds, United Kingdom. PARTICIPANTS Forty-one participants (aged 19-59 years) with experience of chronic or recurrent pain episodes. METHODS We undertook a survey, content analysis, and quality appraisal of all currently available mobile phone apps for self-management of pain.
Two apps were then selected and assessed with usability testing. RESULTS Twelve apps met the inclusion criteria. The quality assessment revealed wide variation in their clinical content, interface design, and usability to support self-management of pain. Very little user or clinician involvement was identified in the development of the apps. From the usability testing, participants stated a preference for an interface design employing a lighter color scheme and a particular text font. Although very few participants were aware of pain-reporting apps prior to participation, many would consider use in the future. CONCLUSIONS Variation in app quality and a lack of user and clinician engagement in development were found across the pain apps in this research. Usability testing identified a range of user preferences. Although useful information was obtained, it would be beneficial to involve users earlier in the process of development, as well as establishing ways to merge end-user requirements with evidence-based content, to provide high-quality and usable apps for self-management of pain. Objective: First, to investigate the effects of a telerehabilitation intervention on health status and activity level of patients with Chronic Obstructive Pulmonary Disease (COPD), compared to usual care. Second, to investigate how patients comply with the intervention and whether compliance is related to treatment outcomes. Design: a randomized controlled pilot trial. Subjects: Thirty-four patients diagnosed with COPD. Intervention: The telerehabilitation application consists of an activity coach (3D accelerometer with smartphone) for ambulant activity registration and real-time feedback, complemented by a web portal with a symptom diary for self-treatment of exacerbations. The intervention group used the application for 4 weeks. The control group received usual care.
Main measures: Activity level measured by a pedometer (in steps/day), and health status by the Clinical COPD Questionnaire, at baseline and after intervention. Compliance was expressed as the time the activity coach was worn. Results: Fourteen intervention and 16 control patients completed the study. Activity level (steps/day) was not significantly affected by the intervention over time. There was a non-significant difference in improvement in health status between the intervention (−0.34±0.55) and control group (0.02±0.57, p=0.10). Health status significantly improved within the intervention group (p=0.05). The activity coach was used more than prescribed (108%), and compliance was related to the increase in activity level for the first two feedback weeks (r=0.62, p=0.03). Conclusions: This pilot study shows the potential of the telerehabilitation intervention: compliance with the activity coach was high, which was directly related to an improvement in activity levels.
There were significant improvements in peak oxygen uptake, 6-min walking test distance, and ventilatory threshold, whereas quality of life and echocardiographic parameters improved only in some studies. Endothelial function/arterial stiffness remained unchanged. No adverse events were reported. Appropriate exercise programs are able to achieve a favorable cardiovascular outcome in patients with HFpEF. Quality of life may also benefit, although the evidence is more controversial.
Physical activity is associated with a lower risk of adverse cardiovascular outcomes, including heart failure (HF). Exercise training is a class IA level recommendation in patients with stable HF, but its impact is less clear in heart failure with preserved ejection fraction (HFpEF). The aim of this study was to analyze the effects of exercise training on cardiovascular outcomes in patients with HFpEF.
OBJECTIVES We sought to determine whether structured exercise training (ET) improves maximal exercise capacity, left ventricular diastolic function, and quality of life (QoL) in patients with heart failure with preserved ejection fraction (HFpEF). BACKGROUND Nearly one-half of patients with heart failure experience HFpEF, but effective therapeutic strategies are sparse. METHODS A total of 64 patients (age 65±7 years, 56% female) with HFpEF were prospectively randomized (2:1) to supervised endurance/resistance training in addition to usual care (ET, n=44) or to usual care alone (UC) (n=20). The primary endpoint was the change in peak VO2 after 3 months. Secondary endpoints included effects on cardiac structure, diastolic function, and QoL. RESULTS Peak VO2 increased (16.1±4.9 ml/min/kg to 18.7±5.4 ml/min/kg; p<0.001) with ET and remained unchanged (16.7±4.7 ml/min/kg to 16.0±6.0 ml/min/kg; p=NS) with UC. The mean benefit of ET was 3.3 ml/min/kg (95% confidence interval [CI]: 1.8 to 4.8, p<0.001). E/e' (mean difference of changes: -3.2, 95% CI: -4.3 to -2.1, p<0.001) and left atrial volume index (milliliters per square meter) decreased with ET and remained unchanged with UC (-4.0, 95% CI: -5.9 to -2.2, p<0.001). The physical functioning score (36-Item Short-Form Health Survey) improved with ET and remained unchanged with UC (15, 95% CI: 7 to 24, p<0.001). The ET-induced decrease of E/e' was associated with 38% of the gain in peak VO2 and 50% of the improvement in physical functioning score. CONCLUSIONS Exercise training improves exercise capacity and physical dimensions of QoL in HFpEF. This benefit is associated with atrial reverse remodeling and improved left ventricular diastolic function.
(Exercise Training in Diastolic Heart Failure-Pilot Study: A Prospective, Randomised, Controlled Study to Determine the Effects of Physical Training on Exercise Capacity and Quality of Life [Ex-DHF-P]; ISRCTN42524037) Background—Heart failure (HF) with preserved left ventricular ejection fraction (HFPEF) is the most common form of HF in the older population. Exercise intolerance is the primary chronic symptom in patients with HFPEF and is a strong determinant of their reduced quality of life (QOL). Exercise training (ET) improves exercise intolerance and QOL in patients with HF with reduced ejection fraction (EF). However, the effect of ET in HFPEF has not been examined in a randomized controlled trial. Methods and Results—This 16-week investigation was a randomized, attention-controlled, single-blind study of medically supervised ET (3 days per week) on exercise intolerance and QOL in 53 elderly patients (mean age, 70±6 years; range, 60 to 82 years; women, 46) with isolated HFPEF (EF ≥50% and no significant coronary, valvular, or pulmonary disease). Attention controls received biweekly follow-up telephone calls. Forty-six patients completed the study (24 ET, 22 controls). Attendance at exercise sessions in the ET group was excellent (88%; range, 64% to 100%). There were no trial-related adverse events. The primary outcome of peak exercise oxygen uptake increased significantly in the ET group compared to the control group (13.8±2.5 to 16.1±2.6 mL/kg per minute [change, 2.3±2.2 mL/kg per minute] versus 12.8±2.6 to 12.5±3.4 mL/kg per minute [change, −0.3±2.1 mL/kg per minute]; P=0.0002). There were significant improvements in peak power output, exercise time, 6-minute walk distance, and ventilatory anaerobic threshold (all P<0.002). There was improvement in the physical QOL score (P=0.03) but not in the total score (P=0.11).
Conclusions—ET improves peak and submaximal exercise capacity in older patients with HFPEF. Clinical Trial Registration—URL: http://www.clinicaltrials.gov. Unique identifier: NCT01113840 BACKGROUND The health benefits of leisure-time physical activity are well known, but whether less exercise than the recommended 150 min a week can have life expectancy benefits is unclear. We assessed the health benefits of a range of volumes of physical activity in a Taiwanese population. METHODS In this prospective cohort study, 416,175 individuals (199,265 men and 216,910 women) participated in a standard medical screening programme in Taiwan between 1996 and 2008, with an average follow-up of 8.05 years (SD 4.21). On the basis of the amount of weekly exercise indicated in a self-administered questionnaire, participants were placed into one of five categories of exercise volumes: inactive, or low, medium, high, or very high activity. We calculated hazard ratios (HR) for mortality risks for every group compared with the inactive group, and calculated life expectancy for every group. FINDINGS Compared with individuals in the inactive group, those in the low-volume activity group, who exercised for an average of 92 min per week (95% CI 71-112) or 15 min a day (SD 1.8), had a 14% reduced risk of all-cause mortality (0.86, 0.81-0.91), and had a 3-year longer life expectancy. Every additional 15 min of daily exercise beyond the minimum amount of 15 min a day further reduced all-cause mortality by 4% (95% CI 2.5-7.0) and all-cancer mortality by 1% (0.3-4.5). These benefits were applicable to all age groups and both sexes, and to those with cardiovascular disease risks. Individuals who were inactive had a 17% (HR 1.17, 95% CI 1.10-1.24) increased risk of mortality compared with individuals in the low-volume group.
INTERPRETATION 15 min a day or 90 min a week of moderate-intensity exercise might be of benefit, even for individuals at risk of cardiovascular disease. FUNDING Taiwan Department of Health Clinical Trial and Research Center of Excellence and National Health Research Institutes. Exercise training improves functional capacity in patients with exercise limitation attributed to systolic dysfunction (SD), but the effect of exercise training in patients with diastolic dysfunction is unclear. The authors determined the functional capacity, quality of life, and echocardiography responses of heart failure with preserved ejection fraction (HFpEF) patients to 16 weeks of exercise training. Thirty patients with HFpEF were randomized to an exercise training or non-exercising control group. The patients had a baseline mean age of 64±8 years, left ventricular ejection fraction 57%±10%, and peak oxygen consumption (peak VO2) of 13.3±3.8 mL O2/kg/min. Minnesota Living With Heart Failure and Hare-Davis scores and echocardiographic measures (ejection fraction, systolic and diastolic tissue velocity and filling pressure [E/E']) were performed at baseline and after 16 weeks of exercise training. The exercise training and non-exercising control groups showed similar baseline VO2 (12.2±3.6 mL/kg/min vs 14.1±4.1 mL/kg/min), ejection fraction (58%±13% vs 57%±8%), and systolic and diastolic function. After exercise training, the increment in peak VO2 was 24.6% (P=.02) in the exercise training group versus 5.1% (P=.19) in the non-exercising control group. VE/VCO2 slope was reduced by 12.7% in the exercise training group (P=.02) but was unchanged in the non-exercising control group (P=.03). No significant changes in diastolic or systolic function were noted in either group. Quality-of-life and depression scores were unchanged with exercise training.
Changes in peak VO2 and VE/VCO2 slope were unrelated to measures of diastolic and systolic function. In patients with exercise limitation attributed to HFpEF, the improvement in peak VO2 with exercise training was not clearly related to changes in cardiac function. OBJECTIVE: To test the feasibility of creating a valid and reliable checklist with the following features: appropriate for assessing both randomised and non-randomised studies; provision of both an overall score for study quality and a profile of scores not only for the quality of reporting, internal validity (bias and confounding) and power, but also for external validity. DESIGN: A pilot version was first developed, based on epidemiological principles, reviews, and existing checklists for randomised studies. Face and content validity were assessed by three experienced reviewers, and reliability was determined using two raters assessing 10 randomised and 10 non-randomised studies. Using different raters, the checklist was revised and tested for internal consistency (Kuder-Richardson 20), test-retest and inter-rater reliability (Spearman correlation coefficient and sign rank test; kappa statistics), criterion validity, and respondent burden. MAIN RESULTS: The performance of the checklist improved considerably after revision of the pilot version. The Quality Index had high internal consistency (KR-20: 0.89), as did the subscales apart from external validity (KR-20: 0.54). Test-retest (r 0.88) and inter-rater (r 0.75) reliability of the Quality Index were good. Reliability of the subscales varied from good (bias) to poor (external validity). The Quality Index correlated highly with an existing, established instrument for assessing randomised studies (r 0.90). There was little difference between its performance with non-randomised and with randomised studies.
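The checklist study above reports internal consistency as Kuder-Richardson 20 (KR-20), the dichotomous-item special case of Cronbach's alpha. As a minimal sketch of the classic formula (using the population variance of total scores, which is the textbook convention; the toy data below are hypothetical, not the study's):

```python
def kr20(item_matrix):
    """Kuder-Richardson formula 20 for dichotomous (0/1) items.

    item_matrix: one row per respondent, one 0/1 entry per item.
    KR-20 = (k/(k-1)) * (1 - sum(p*q) / variance of total scores).
    """
    n = len(item_matrix)     # number of respondents
    k = len(item_matrix[0])  # number of items
    # per-item proportion scoring 1 (p); q = 1 - p
    ps = [sum(row[j] for row in item_matrix) / n for j in range(k)]
    sum_pq = sum(p * (1.0 - p) for p in ps)
    totals = [sum(row) for row in item_matrix]
    mean_total = sum(totals) / n
    variance = sum((t - mean_total) ** 2 for t in totals) / n
    return (k / (k - 1)) * (1.0 - sum_pq / variance)
```

With perfectly consistent respondents (all items agree within each respondent) the statistic reaches 1.0, and with items varying independently it falls toward 0, which is the sense in which KR-20 of 0.89 indicates high internal consistency.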
Raters took about 20 minutes to assess each paper (range 10 to 45 minutes). CONCLUSIONS: This study has shown that it is feasible to develop a checklist that can be used to assess the methodological quality not only of randomised controlled trials but also of non-randomised studies. It has also shown that it is possible to produce a checklist that provides a profile of the paper, alerting reviewers to its particular methodological strengths and weaknesses. Further work is required to improve the checklist and the training of raters in the assessment of external validity. Heart failure with preserved ejection fraction (HFpEF) is a major cause of morbidity and mortality. Exercise training is an established adjuvant therapy in heart failure; however, the effects of high-intensity interval training (HIIT) in HFpEF are unknown. We compared the effects of HIIT vs. moderate-intensity aerobic continuous training (MI-ACT) on peak oxygen uptake (VO2peak), left ventricular diastolic dysfunction, and endothelial function in patients with HFpEF. Nineteen patients with HFpEF (age 70±8.3 yr) were randomized to either HIIT (4×4 min at 85-90% peak heart rate, with 3 min active recovery) or MI-ACT (30 min at 70% peak heart rate). Fifteen patients completed exercise training (HIIT: n=9; MI-ACT: n=6). Patients trained 3 days/wk for 4 wk. Before and after training, patients underwent a treadmill test for VO2peak determination, 2D echocardiography for assessment of left ventricular diastolic dysfunction, and brachial artery flow-mediated dilation (FMD) for assessment of endothelial function. HIIT improved VO2peak (pre=19.2±5.2 ml·kg(-1)·min(-1); post=21.0±5.2 ml·kg(-1)·min(-1); P=0.04) and left ventricular diastolic dysfunction grade (pre=2.1±0.3; post=1.3±0.7; P=0.02), but FMD was unchanged (pre=6.9±3.7%; post=7.0±4.2%). No changes were observed following MI-ACT.
A trend for reduced left atrial volume index was observed following HIIT compared with MI-ACT (-3.3±6.6 vs. +5.8±10.7 ml/m2; P=0.06). In HFpEF patients, 4 wk of HIIT significantly improved VO2peak and left ventricular diastolic dysfunction. HIIT may provide a more robust stimulus than MI-ACT for early exercise training adaptations in HFpEF. OBJECTIVE To evaluate the change in the 6-minute walk test (6-MWT) distance relative to changes in key functional capacity measures after 16 weeks of exercise training in older patients (≥65 y) who have heart failure with preserved ejection fraction (HFpEF). DESIGN Prospective, randomized, single-blinded (by researchers to patient group) comparison of 2 groups of HFpEF patients. SETTING Hospital and clinic records; ambulatory outpatients. PARTICIPANTS Participants (N=47) randomly assigned to an attention control (AC) (n=24) or exercise training (ET) (n=23) group. INTERVENTION The ET group performed cycling and walking at 50% to 70% of peak oxygen uptake (VO2peak) intensity (3 d/wk, 60 min each session). MAIN OUTCOME MEASURES VO2peak, ventilatory threshold (VT), and 6-MWT distance were measured at baseline and after the 16-week study period. RESULTS At follow-up, the 6-MWT distance was higher than at baseline in both the ET (11%, P=.005) and AC (9%, P=.004) groups. In contrast, VO2peak and VT values increased in the ET group (19% and 11%, respectively; P=.001), but decreased in the AC group at follow-up (2% and 0%, respectively). The change in VO2peak versus 6-MWT distance after training was not significantly correlated in the AC group (r=.01, P=.95) or in the ET group (r=.13, P=.57). The change in 6-MWT distance and VT (an objective submaximal exercise measure) was also not significantly correlated in the AC group (r=.08, P=.74) or in the ET group (r=.16, P=.50).
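The 6-MWT analysis above rests on Pearson correlation coefficients between change scores (e.g. r=.01 to r=.16). As a brief sketch of the underlying statistic (plain Python, toy inputs; the study's own values came from its paired change scores, which are not reproduced here):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples:
    covariance of the pairs divided by the product of their spreads."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    spread_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    spread_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (spread_x * spread_y)
```

A value near 0, as reported for 6-MWT change versus VO2peak change, means the two change scores move essentially independently, which is the basis for questioning the 6-MWT as a serial surrogate for peak exercise capacity.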
CONCLUSIONS The results of this study challenge the validity of using the 6-MWT as a serial measure of exercise tolerance in elderly HFpEF patients and suggest that submaximal and peak exercise capacity should be determined objectively by VT and VO2peak in this patient population. Reduced heart rate variability (HRV) in older patients with heart failure (HF) is common and indicates poor prognosis. Exercise training (ET) has been shown to improve HRV in younger patients with HF. However, the effect of ET on HRV in older patients with HF is not known. Sixty-six participants (36% men), aged 69±5 years, with HF and both preserved ejection fraction (HFPEF) and reduced ejection fraction (HFREF), were randomly assigned to 16 weeks of supervised ET (ET group) vs attention control (AC group). Two HRV parameters (the standard deviation of all normal RR intervals [SDNN] and the root mean square of successive differences in normal RR intervals [RMSSD]) were measured at baseline and after completion of the study. When compared with the AC group, the ET group had a significantly greater increase in both SDNN (15.46±5.02 ms in ET vs 2.37±2.13 ms in AC, P=.016) and RMSSD (17.53±7.83 ms in ET vs 1.69±2.63 ms in AC, P=.003). This increase was seen in both sexes and HF categories. ET improved HRV in older patients with both HFREF and HFPEF. IMPORTANCE More than 80% of patients with heart failure with preserved ejection fraction (HFPEF), the most common form of heart failure among older persons, are overweight or obese. Exercise intolerance is the primary symptom of chronic HFPEF and a major determinant of reduced quality of life (QOL). OBJECTIVE To determine whether caloric restriction (diet) or aerobic exercise training (exercise) improves exercise capacity and QOL in obese older patients with HFPEF.
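The HRV trial above defines its two time-domain endpoints directly from the RR-interval series: SDNN (standard deviation of normal RR intervals) and RMSSD (root mean square of successive differences). A minimal sketch of those two definitions follows (plain Python, RR intervals in milliseconds; the sample-variance convention for SDNN is an assumption, as published HRV software varies between sample and population forms):

```python
import math

def sdnn(rr_ms):
    """Standard deviation of normal-to-normal RR intervals (ms).
    Uses the sample (n-1) form; some HRV tools use the population form."""
    mean = sum(rr_ms) / len(rr_ms)
    return math.sqrt(sum((x - mean) ** 2 for x in rr_ms) / (len(rr_ms) - 1))

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences (ms)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))
```

SDNN captures overall variability over the recording, while RMSSD is driven by beat-to-beat differences and is conventionally read as a vagally mediated short-term index, which is why the trial reports both.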
DESIGN, SETTING, AND PARTICIPANTS Randomized, attention-controlled, 2×2 factorial trial conducted from February 2009 through November 2014 in an urban academic medical center. Of 577 initially screened participants, 100 older obese participants (mean [SD]: age, 67 years [5]; body mass index, 39.3 [5.6]) with chronic, stable HFPEF were enrolled (366 excluded by inclusion and exclusion criteria, 31 for other reasons, and 80 declined participation). INTERVENTIONS Twenty weeks of diet, exercise, or both; attention control consisted of telephone calls every 2 weeks. MAIN OUTCOMES AND MEASURES Exercise capacity measured as peak oxygen consumption (VO2, mL/kg/min; co-primary outcome) and QOL measured by the Minnesota Living with Heart Failure (MLHF) Questionnaire (score range: 0-105, higher scores indicate worse heart failure-related QOL; co-primary outcome). RESULTS Of the 100 enrolled participants, 26 were randomized to exercise, 24 to diet, 25 to exercise + diet, and 25 to control. Of these, 92 participants completed the trial. Exercise attendance was 84% (SD, 14%) and diet adherence was 99% (SD, 1%). By main effects analysis, peak VO2 was increased significantly by both interventions: exercise, 1.2 mL/kg body mass/min (95% CI, 0.7 to 1.7), P<.001; diet, 1.3 mL/kg body mass/min (95% CI, 0.8 to 1.8), P<.001. The combination of exercise + diet was additive (complementary) for peak VO2 (joint effect, 2.5 mL/kg/min). There was no statistically significant change in MLHF total score with exercise or with diet (main effect: exercise, -1 unit [95% CI, -8 to 5], P=.70; diet, -6 units [95% CI, -12 to 1], P=.08). The change in peak VO2 was positively correlated with the change in percent lean body mass (r=0.32; P=.003) and the change in thigh muscle:intermuscular fat ratio (r=0.27; P=.02).
There were no study-related serious adverse events. Body weight decreased by 7% (7 kg [SD, 1]) in the diet group, 3% (4 kg [SD, 1]) in the exercise group, 10% (11 kg [SD, 1]) in the exercise + diet group, and 1% (1 kg [SD, 1]) in the control group. CONCLUSIONS AND RELEVANCE Among obese older patients with clinically stable HFPEF, caloric restriction or aerobic exercise training increased peak V̇O2, and the effects may be additive. Neither intervention had a significant effect on quality of life as measured by the MLHF Questionnaire. TRIAL REGISTRATION clinicaltrials.gov Identifier: NCT00959660. OBJECTIVES The study sought to evaluate the effects of endurance exercise training (ET) on endothelial-dependent flow-mediated arterial dilation (FMD) and carotid artery stiffness, and their potential contributions to the training-related increase in peak exercise oxygen consumption (V̇O2) in older patients with heart failure with preserved ejection fraction (HFPEF). BACKGROUND Elderly HFPEF patients have severely reduced peak V̇O2, which improves with ET; however, the mechanisms of this improvement are unclear. FMD and arterial distensibility are critical components of the exercise response and are reduced with aging. However, it is unknown whether these improve with ET in elderly HFPEF or contribute to the training-related improvement in peak V̇O2. METHODS A total of 63 HFPEF patients (age 70 ± 7 years) were randomized to 16 weeks of ET (walking, arm and leg ergometry, n = 32) or attention control (CT) (n = 31). Peak V̇O2, brachial artery FMD in response to cuff ischemia, carotid artery distensibility by high-resolution ultrasound, left ventricular function, and quality of life were measured at baseline and follow-up. RESULTS ET increased peak V̇O2 (ET: 15.8 ± 3.3 ml/kg/min vs. CT: 13.8 ± 3.1 ml/kg/min, p = 0.0001) and quality of life. However, brachial artery FMD (ET: 3.8 ± 3.0% vs.
CT: 4.3 ± 3.5%, p = 0.88) and carotid arterial distensibility (ET: 0.97 ± 0.56 vs. CT: 1.07 ± 0.34 × 10(-3) mm·mm Hg(-2); p = 0.65) were unchanged. Resting left ventricular systolic and diastolic function were unchanged by ET. CONCLUSIONS In elderly HFPEF patients, 16 weeks of ET improved peak V̇O2 without altering endothelial function or arterial stiffness. This suggests that other mechanisms, such as enhanced skeletal muscle perfusion and/or oxygen utilization, may be responsible for the ET-mediated increase in peak V̇O2 in older HFPEF patients. (Prospective Aerobic Reconditioning Intervention Study [PARIS]; NCT01113840) Background Despite suffering from poor prognosis, progressive exercise intolerance, and impaired quality of life (QoL), effective therapeutic strategies in heart failure with preserved ejection fraction (HFpEF) are sparse. Exercise training (ET) improves physical QoL in HFpEF, but the effects on other aspects of QoL are unknown. Methods The multicentre, prospective, randomized, controlled Exercise training in Diastolic Heart Failure Pilot study included 64 HFpEF patients (65 ± 7 years, 56% female). They were randomized to supervised endurance/resistance training in addition to usual care (ET, n = 44) or usual care alone (UC, n = 20). At baseline and after 3 months, QoL was assessed (36-item Short-Form Health Survey (SF-36), Minnesota Living With Heart Failure Questionnaire (MLWHFQ), and Patient Health Questionnaire (PHQ-9)). Results Exercise improved the following SF-36 dimensions: physical functioning (p < 0.001, p = 0.001 vs. UC), bodily pain (p = 0.046), general health perception (p < 0.001, p = 0.016 vs. UC), general mental health (p = 0.002), vitality (p = 0.003), social functioning (p < 0.001), physical (p < 0.001, p = 0.001 vs. UC), and mental component score (p = 0.030).
ET did not improve role limitations due to physical and emotional problems. The MLWHFQ total scale (p < 0.001) and the MLWHFQ physical limitation scale (p < 0.001, p = 0.04 vs. UC) also improved with ET. The MLWHFQ emotional limitation scale did not change with ET. With ET, the PHQ-9 total score also improved significantly (p = 0.004, p = 0.735 vs. UC). Conclusions In patients with HFpEF, exercise training improved emotional status, physical and social dimensions of QoL, and symptoms of depression from pre- to post-test. Physical dimensions of QoL and general health perception also improved significantly with exercise in comparison to usual care.
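The two HRV indices defined in the exercise-training abstract in this record (SDNN and RMSSD) are simple statistics over a series of normal RR intervals. A minimal sketch follows; the RR-interval values are invented for illustration and are not data from any of the trials above.

```python
import math

def sdnn(rr_ms):
    """Standard deviation of all normal RR intervals, in ms."""
    mean = sum(rr_ms) / len(rr_ms)
    return math.sqrt(sum((x - mean) ** 2 for x in rr_ms) / len(rr_ms))

def rmssd(rr_ms):
    """Root mean square of successive differences in normal RR intervals, in ms."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical RR intervals (ms) for demonstration only.
rr = [812, 790, 805, 798, 820]
print(f"SDNN = {sdnn(rr):.1f} ms, RMSSD = {rmssd(rr):.1f} ms")
```

Higher values of either index indicate greater beat-to-beat variability, which is the direction of change the trial reports with exercise training.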
11,042
23,235,602
There was no difference in intraoperative blood loss or in the volumes of intraoperative red cells or fresh frozen plasma transfused between groups. The non-buffered fluid group also had significantly greater base deficit and higher serum sodium and chloride levels. There was no difference demonstrated in length of hospital stay, and no data were reported on cost or quality of life. AUTHORS' CONCLUSIONS The administration of buffered fluids to adult patients during surgery is as safe and effective as the administration of non-buffered saline-based fluids. The use of buffered fluids is associated with less metabolic derangement, in particular hyperchloraemia and metabolic acidosis.
BACKGROUND Perioperative fluid therapy influences clinical outcomes following major surgery. Fluid preparations may be based on a simple non-buffered salt solution, such as normal saline, or may be modified with bicarbonate or bicarbonate-precursor buffers, such as maleate, gluconate, lactate or acetate, to better reflect the human physiological state. These latter fluids have theoretical advantages over normal saline in preventing hyperchloraemic acidosis. A number of clinical studies have now compared fluid preparations with and without a buffer to achieve a balanced electrolyte solution for perioperative fluid resuscitation. OBJECTIVES To review the safety and efficacy of perioperative administration of buffered versus non-buffered fluids for plasma volume expansion or maintenance in adult patients undergoing surgery.
A prospective randomized double-blind study was performed to determine the effects of three colloids, Haemaccel, Gelofusine and albumin, and also saline, on platelet activation, platelet aggregation (induced by adenosine diphosphate (ADP), epinephrine, and collagen), platelet agglutination by ristocetin, and other hemostatic variables in 55 patients undergoing primary unilateral total hip replacement. The fluids were administered according to normal clinical practice and assessments were made immediately before, at the end of, and 2 h after the end of surgery. Surgery was accompanied by thrombin generation (increases in thrombin/antithrombin III complex and prothrombin F1+2 fragment), platelet activation (betaTG) and compromised coagulation. Generally, the platelet activation appeared to result in platelet desensitization and brought about a persistent reduction in platelet aggregation to ADP and epinephrine, irrespective of the fluid used. Additionally, Haemaccel and Gelofusine inhibited ristocetin-induced platelet agglutination, and albumin inhibited collagen-induced platelet aggregation. Gross inhibitory effects of Haemaccel that had been predicted from an earlier in vitro study did not occur. Particular fluids had selective additional effects on the hemostatic system. Albumin infusion served to maintain plasma albumin at normal concentrations post-surgery. The two gelatin preparations, Haemaccel and Gelofusine, maintained plasma viscosity. All three colloids led to a transient increase in activated partial thromboplastin time post-surgery and also a transient fall in the concentration of factor VIII, which were accompanied by a transient increase in bleeding time, but there was no measurable increase in blood loss.
Inhibition of platelet aggregation by certain colloids may provide additional protection against the increased thrombotic risk in patients following major surgery. Metabolic acidosis and changes in serum osmolarity are consequences of 0.9% normal saline (NS) solution administration. We sought to determine if these physiologic changes influence patient outcome. Patients undergoing aortic reconstructive surgery were enrolled and were randomly assigned to receive lactated Ringer's (LR) solution (n = 33) or NS (n = 33) in a double-blinded fashion. Anesthetic and fluid management were standardized. Multiple measures of outcome were monitored. The NS patients developed a hyperchloremic acidosis and received more bicarbonate therapy (30 ± 62 mL in the NS group versus 4 ± 16 mL in the LR group; mean ± SD), which was given if the base deficit was greater than -5 mEq/L. The NS patients also received a larger volume of platelet transfusion (478 ± 302 mL in the NS group versus 223 ± 24 mL in the LR group; mean ± SD). When all blood products were summed, the NS group received significantly more blood products (P = 0.02). There were no differences in duration of mechanical ventilation, intensive care unit stay, hospital stay, or incidence of complications. When NS was used as the primary intraoperative solution, significantly more acidosis was seen on completion of surgery. This acidosis resulted in no apparent change in outcome but required larger amounts of bicarbonate to achieve predetermined measurements of base deficit and was associated with the use of larger amounts of blood products. These changes should be considered when choosing fluids for surgical procedures involving extensive blood loss and requiring extensive fluid administration. Study Design. Prospective, randomized, double-blind clinical study. Objective.
To compare the hemostatic and electrolyte effects of 2 commonly administered hydroxyethyl starches (HES) in patients undergoing posterior lumbar interbody fusion (PLIF). Summary of Background Data. HES are commonly administered colloid solutions used to restore and maintain intravascular volume before transfusion is initiated. However, infusion of HES itself can impair coagulation. HES-induced coagulopathy could be a serious problem in PLIF, which involves continuous bone bleeding and oozing. Voluven (Fresenius Kabi, Germany), previously regarded as the least coagulopathic due to its low molecular weight (MW) and degree of substitution (DS), is a saline-based HES. Hextend (Biotime, United States) is a new type of HES with physiologic pH and balanced electrolytes, including calcium, which is beneficial to coagulation. Studies comparing the coagulopathy of Hextend and Voluven are rare. Therefore, coagulation, pH/electrolyte changes, and blood loss using Hextend and Voluven in patients undergoing PLIF were compared. Methods. Fifty-four patients scheduled for PLIF involving 3 vertebrae or less were randomly assigned to the Voluven or the Hextend group. Of each solution, 15 mL/kg was administered during surgery. Blood loss, coagulation, and electrolyte profiles were checked before infusion and 5 minutes, 3 hours, and 24 hours after the end of infusion. Results. The Hextend group showed slightly better electrolyte balance but more coagulation impairment and more postoperative transfusion (37% vs. 11%) compared with the Voluven group. The effect of Hextend on coagulation lasted until 24 hours after infusion. Conclusion.
If coagulopathy is a concern during PLIF, then a HES with low MW/DS in a saline-based medium (Voluven) may be a better alternative than a HES with high MW/DS in a balanced salt medium (Hextend). OBJECTIVES To compare the effects of lactated Ringer's solution (LR), 6% hetastarch in a balanced-saline vehicle (HS-BS), and 6% hetastarch in normal saline (HS-NS) on coagulation using thromboelastography. DESIGN Prospective, randomized, double-blinded evaluation of a previously published clinical trial. SETTING Tertiary-care medical center. PARTICIPANTS Patients undergoing elective noncardiac surgery with an anticipated blood loss > 500 mL. A total of 90 patients were enrolled, with 30 patients in each group. INTERVENTIONS Patients received a standardized anesthetic. LR, HS-BS, and HS-NS were administered intraoperatively based on a fluid administration algorithm. Hemodynamic targets included maintenance of arterial blood pressure, heart rate, and urine output within a predefined range. MEASUREMENTS AND MAIN RESULTS Thromboelastography variables for r time, k time, maximum amplitude, and alpha angle (mean ± SD) were recorded at induction of anesthesia, at the end of surgery, and 24 hours postoperatively. Patients in the LR group showed a state of hypercoagulation at the end of surgery, with reductions (p < 0.005) in r time (-3.8 ± 6.7 mm) and k time (-1.7 ± 2.5 mm). This state of hypercoagulation continued into the postoperative period. Patients in the HS-NS group showed a state of hypocoagulation, with increases (p < 0.05) in r time (+6.2 ± 8.5 mm) and k time (+1.7 ± 3.9 mm) and a reduction in maximum amplitude (-8.0 ± 9.8 mm) at the end of surgery. This state of hypocoagulation was reduced in the postoperative period.
Patients in the HS-BS group showed no significant changes in coagulation status at the end of surgery, with the smallest changes in r time (-0.3 ± 4.1 mm), k time (+0.1 ± 3.1 mm), maximum amplitude (-5.4 ± 12.3 mm), and alpha angle (0.3 ± 12.5 degrees). CONCLUSION LR-treated patients exhibited a hypercoagulative profile that persisted into the postoperative period. HS-BS administration was associated with a lesser change in the coagulation profile compared with HS-NS, which was associated with a hypocoagulative state. Normal saline (NS; 0.9% NaCl) is administered during kidney transplantation to avoid the risk of hyperkalemia associated with potassium-containing fluids. Recent evidence suggests that NS may be associated with adverse effects that are not seen with balanced-salt fluids, e.g., lactated Ringer's solution (LR). We hypothesized that NS is detrimental to renal function in kidney transplant recipients. Adults undergoing kidney transplantation were enrolled in a prospective, randomized, double-blind clinical trial of NS versus LR for intraoperative IV fluid therapy. The primary outcome measure was creatinine concentration on postoperative Day 3. The study was terminated for safety reasons after interim analysis of data from 51 patients. Forty-eight patients underwent living-donor kidney transplants, and three patients underwent cadaveric-donor transplants. Twenty-six patients received NS, and 25 patients received LR. There was no difference between groups in the primary outcome measure. Five (19%) patients in the NS group versus zero (0%) patients in the LR group had potassium concentrations > 6 mEq/L and were treated for hyperkalemia (P = 0.05). Eight (31%) patients in the NS group versus zero (0%) patients in the LR group were treated for metabolic acidosis (P = 0.004). NS did not adversely affect renal function. LR was associated with less hyperkalemia and acidosis compared with NS.
LR may be a safe choice for IV fluid therapy in patients undergoing kidney transplantation. Purpose A balanced fluid replacement strategy appears to be promising for correcting hypovolemia. The benefits of a balanced fluid replacement regimen were studied in elderly cardiac surgery patients. Methods In a randomized clinical trial, 50 patients aged > 75 years undergoing cardiac surgery received a balanced 6% HES 130/0.42 plus a balanced crystalloid solution (n = 25) or a non-balanced HES in saline plus saline solution (n = 25) to keep pulmonary capillary wedge pressure/central venous pressure between 12-14 mmHg. Acid-base status, inflammation, endothelial activation (soluble intercellular adhesion molecule-1), and kidney integrity (kidney-specific proteins: glutathione transferase-alpha; neutrophil gelatinase-associated lipocalin) were studied after induction of anesthesia, 5 h after surgery, and 1 and 2 days thereafter. Serum creatinine (sCr) was measured approximately 60 days after discharge. Results A total of 2,750 ± 640 mL of balanced and 2,820 ± 550 mL of unbalanced HES were given until the second POD. Base excess (BE) was significantly reduced in the unbalanced group (from +1.21 ± 0.3 to -4.39 ± 1.0 mmol L−1 5 h after surgery; P < 0.001) and remained unchanged in the balanced group (from 1.04 ± 0.3 to -0.81 ± 0.3 mmol L−1 5 h after surgery). Evolution of the BE was significantly different. The inflammatory response and endothelial activation were significantly less pronounced in the balanced than in the unbalanced group. Concentrations of kidney-specific proteins after surgery indicated less alteration of kidney integrity in the balanced than in the unbalanced group.
Conclusions A totally balanced volume replacement strategy including a balanced HES and a balanced crystalloid solution resulted in moderate beneficial effects on acid-base status, inflammation, endothelial activation, and kidney integrity compared to a conventional unbalanced volume replacement regimen. The IV administration of sodium chloride solutions may produce a metabolic acidosis and gastrointestinal dysfunction. We designed this trial to determine whether, in elderly surgical patients, crystalloid and colloid solutions with a more physiologically balanced electrolyte formulation, such as Hartmann's solution and Hextend®, can provide a superior metabolic environment and improved indices of organ perfusion when compared with saline-based fluids. Forty-seven elderly patients undergoing major surgery were randomly allocated to one of two study groups. Patients in the Balanced Fluid group received an intraoperative fluid regimen that consisted of Hartmann's solution and 6% hetastarch in balanced electrolyte and glucose injection (Hextend). Patients in the Saline group were given 0.9% sodium chloride solution and 6% hetastarch in 0.9% sodium chloride solution (Hespan®). Biochemical indices and acid-base balance were determined. Gastric tonometry was used as a reflection of splanchnic perfusion. Postoperative chloride levels demonstrated a larger increase in the Saline group than in the Balanced Fluid group (9.8 vs 3.3 mmol/L, P = 0.0001). Postoperative standard base excess showed a larger decline in the Saline group than in the Balanced Fluid group (-5.5 vs -0.9 mmol/L, P = 0.0001). Two-thirds of patients in the Saline group, but none in the Balanced Fluid group, developed postoperative hyperchloremic metabolic acidosis (P = 0.0001). Gastric tonometry indicated a larger increase in the CO2 gap during surgery in the Saline group compared with the Balanced Fluid group (1.7 vs 0.9 kPa, P = 0.0394).
In this study, the use of balanced crystalloid and colloid solutions in elderly surgical patients prevented the development of hyperchloremic metabolic acidosis and resulted in improved gastric mucosal perfusion when compared with saline-based solutions. OBJECTIVE The infusion of large amounts of saline-based solutions may contribute to the development of hyperchloremic metabolic acidosis, and the use of a balanced carrier for colloid solutions might improve postoperative acid-base status. The equivalence of 2 hydroxyethyl starch (HES) solutions and the influence on chloride levels and acid-base status of selectively changing the carrier of rapidly degradable modern 6% HES 130/0.4 were studied in cardiac surgery patients. DESIGN A prospective, randomized, double-blinded study. SETTING A clinical study in 2 cardiac surgery institutions. PARTICIPANTS Eighty-one patients. INTERVENTION Patients received either 6% HES 130/0.4 balanced (Volulyte; Fresenius Kabi, Bad Homburg, Germany) or 6% HES 130/0.4 saline (Voluven; Fresenius Kabi, Bad Homburg, Germany) for intra- and postoperative hemodynamic stabilization. MEASUREMENTS AND MAIN RESULTS The therapeutic equivalence of both HES formulations regarding volume effect, and the superiority of the balanced electrolyte solution regarding serum chloride levels and acid-base status, were measured. Similar volumes of both HES 130/0.4 balanced and HES 130/0.4 saline were administered until 6 hours after surgery: 2,391 ± 518 mL in the HES 130/0.4 balanced group versus 2,241 ± 512 mL in the HES 130/0.4 saline group. The 95% confidence interval for the difference between treatments (-77; 377 mL; mean, 150 mL) was contained entirely in the predefined interval (-500, 500 mL), thereby proving equivalence.
The serum chloride level (mmol/L) was lower (p < 0.05 at the end of surgery), arterial pH was higher in the balanced group at all time points except baseline, and base excess was less negative at all time points after baseline (p < 0.01). CONCLUSIONS Volumes of HES needed for hemodynamic stabilization were equivalent between treatment groups. Significantly lower serum chloride levels in the HES balanced group reflected the lower chloride load of similar infusion volumes. The HES balanced group had significantly less acidosis. BACKGROUND: This study aimed to quantify changes in acid-base balance and potassium and lactate levels as a function of the administration of different crystalloid solutions during kidney transplantation, and to determine the ideal fluid for such patients. METHODS: In this double-blind study, patients were randomized to three groups (n = 30 each) to receive either normal saline, lactated Ringer's, or Plasmalyte, all at 20-30 mL·kg−1·h−1. Arterial blood analyses were performed before induction of anesthesia and at 30-min intervals during surgery, and total IV fluids were recorded. Urine volume, serum creatinine and BUN, and creatinine clearance were recorded on postoperative days 1, 2, 3, and 7. RESULTS: There was a statistically significant decrease in pH (7.44 ± 0.50 vs 7.36 ± 0.05) and base excess (0.4 ± 3.1 vs -4.3 ± 2.1), and a significant increase in serum chloride (104 ± 2 vs 125 ± 3 mM/L), in patients receiving saline during surgery. Lactate levels increased significantly in patients who received Ringer's lactate (0.48 ± 0.29 vs 1.95 ± 0.48). No significant changes in acid-base measures or lactate levels occurred in patients who received Plasmalyte. Potassium levels were not significantly changed in any group.
CONCLUSIONS: All three crystalloid solutions can be safely used during uncomplicated, short-duration renal transplants; however, the best metabolic profile is maintained in patients who receive Plasmalyte. BACKGROUND Changes in acid-base balance caused by infusion of a 0.9% saline solution during anesthesia and surgery are poorly characterized. Therefore, the authors evaluated these phenomena in a dose-response study. METHODS Two groups of 12 patients each who were undergoing major intraabdominal gynecologic surgery were assigned randomly to receive 0.9% saline or lactated Ringer's solution at a dosage of 30 ml·kg−1·h−1. The pH, arterial carbon dioxide tension, and serum concentrations of sodium, potassium, chloride, lactate, and total protein were measured at 30-min intervals. The serum bicarbonate concentration was calculated using the Henderson-Hasselbalch equation and also using the Stewart approach, from the strong ion difference and the amount of weak plasma acid. The strong ion difference was calculated as serum sodium + serum potassium - serum chloride - serum lactate. The amount of weak plasma acid was calculated as the serum total protein concentration in g/dl × 2.43. RESULTS Infusion of 0.9% saline, but not lactated Ringer's solution, caused a metabolic acidosis with hyperchloremia and a concomitant decrease in the strong ion difference. Calculating the serum bicarbonate concentration using the Henderson-Hasselbalch equation or the Stewart approach produced equivalent results. CONCLUSIONS Infusion of approximately 30 ml·kg−1·h−1 saline during anesthesia and surgery inevitably leads to metabolic acidosis, which is not observed after administration of lactated Ringer's solution.
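The Stewart-approach quantities defined in the saline dose-response abstract above (strong ion difference and weak plasma acid) amount to simple arithmetic. A minimal sketch follows; the electrolyte panels are invented for illustration and are not data from the trial.

```python
def strong_ion_difference(na, k, cl, lactate):
    """SID (mEq/L) = Na+ + K+ - Cl- - lactate, as defined in the abstract."""
    return na + k - cl - lactate

def weak_plasma_acid(total_protein_g_dl):
    """Weak plasma acid (mEq/L) = total protein (g/dL) x 2.43, per the abstract."""
    return total_protein_g_dl * 2.43

# Hypothetical panels: a normal profile vs. one after large-volume 0.9% saline.
pre = strong_ion_difference(na=140, k=4.0, cl=104, lactate=1.0)    # 140+4-104-1 = 39
post = strong_ion_difference(na=141, k=3.8, cl=115, lactate=1.2)   # 141+3.8-115-1.2 = 28.6

# A fall in SID with unchanged weak acid implies a hyperchloraemic metabolic acidosis.
print(f"SID before: {pre:.1f} mEq/L, after saline: {post:.1f} mEq/L")
```

The direction of the change, a chloride-driven fall in SID, is what the study reports for saline but not for lactated Ringer's solution.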
The acidosis is associated with hyperchloremia. During an attempt to measure renal function during operation in six patients undergoing major abdominal surgery involving intestinal resection and blood loss in excess of 300 ml, it became apparent that the conventional recommendation for i.v. crystalloid fluid of 5-10 ml kg−1 h−1 was not sufficient to maintain cardiovascular stability and urine output, but a volume of 15 ml kg−1 h−1, given to a subsequent six patients, was adequate. Administration of low-sodium (glucose) solutions also produced biochemical abnormalities of a severity not documented previously. A survey of the published literature on volumes of crystalloid fluids used supports the contention that, during major surgery, crystalloid requirements may be of the order of 10-15 ml kg−1 h−1 rather than 5-10 ml kg−1 h−1. Using a new method of measuring whole blood clotting time, we have confirmed the findings of others that blood clots faster when diluted with saline. A prospective trial was designed to test the hypothesis that intravenous saline given peroperatively causes hypercoagulation and increases the risk of venous thrombosis. Sixty patients admitted for routine laparotomy were randomly allocated to either a group receiving intravenous fluids during or after operation (Wet) or a group receiving no intravenous fluids (Dry). The Wet patients became significantly more haemodilute and hypercoagulable than the Dry (P < 0.001), and these changes correlated. A postoperative deep vein thrombosis occurred in 30 per cent of the Wet patients, but in only 7 per cent of the Dry (P < 0.05). The need for intravenous fluids during uncomplicated surgery is probably unproved, and their greater use may have contributed to the increasing prevalence of venous thromboembolism. Background and objective: The kind of fluid for correcting hypovolaemia is still a focus of debate.
In a prospective, randomized, controlled and double-blind study in patients undergoing major abdominal surgery, a total balanced volume replacement strategy including a new balanced hydroxyethyl starch (HES) solution was compared with a conventional, non-balanced fluid regimen. Methods: In Group A (n = 15), a new balanced 6% HES 130/0.42 was given along with a balanced crystalloid solution; in Group B (n = 15), an unbalanced conventional HES 130/0.42 plus an unbalanced crystalloid (saline solution) were administered. Volume was given when mean arterial pressure (MAP) was < 65 mmHg and central venous pressure (CVP) minus positive end-expiratory pressure (PEEP) level was < 10 mmHg. Haemodynamics, acid-base status, coagulation (thrombelastography (TEG)) and kidney function (including the kidney-specific proteins N-acetyl-beta-D-glucosaminidase (beta-NAG) and alpha-1-microglobulin) were measured after induction of anaesthesia, at the end of surgery, and 5 and 24 h after surgery. Results: Group A received 3533 ± 1302 mL of HES and 5333 ± 1063 mL of crystalloids; in Group B, 3866 ± 1674 mL of HES and 5966 ± 1202 mL of crystalloids were given. Haemodynamics, laboratory data, TEG data and kidney function were without significant differences between the groups. Cl− concentration and base excess (-5 ± 2.4 mmol L−1 vs. 0.4 ± 2.4 mmol L−1) differed significantly between patients of Group B and Group A. Conclusions: A completely balanced volume replacement strategy including a new balanced HES preparation resulted in significantly less derangement in acid-base status compared with a non-balanced volume replacement regimen.
The new HES preparation showed no negative effects on coagulation and kidney function. In this study, we compared the effects of large intravascular volume infusion of 0.9% saline (NS) or lactated Ringer's (LR) solution on electrolytes and acid-base balance during major spine surgery and evaluated the postoperative effects. Thirty patients aged 18-70 yr were included in the study. General anesthesia was induced with 5 mg/kg thiopental and 0.1 mg/kg vecuronium IV. Anesthesia was maintained with oxygen in 70% nitrous oxide and 1.5%-2% sevoflurane. In Group I the NS solution, and in Group II the LR solution, was infused at 20 mL·kg−1·h−1 during the operation and 2.5 mL·kg−1·h−1 postoperatively. Electrolytes (Na+, K+, Cl−) and arterial blood gases were measured preoperatively, every hour intraoperatively, and at the 1st, 2nd, 4th, 6th, and 12th hours postoperatively. In the NS group, pHa, HCO3 and base excess decreased, and Cl− values increased significantly at the 2nd hour and Na+ values at the 4th hour intraoperatively (P < 0.001). The values returned to normal ranges at the 12th hour postoperatively. In the LR group, blood gas analysis and electrolyte values did not show any significant difference intraoperatively, but the increase in PaCO2 and the decrease in pHa and serum Na+ were significant at the 1st hour postoperatively. Although intraoperative 20 mL·kg−1·h−1 LR infusion does not cause hyperchloremic metabolic acidosis as NS infusion does, it leads to postoperative respiratory acidosis and mild hyponatremia. RATIONALE Hyperchloremic acidosis is common in the critically ill and is often iatrogenic. We have previously shown that hyperchloremic acidosis increases nuclear factor-kappaB DNA binding in lipopolysaccharide-stimulated RAW 264.7 cells. However, evidence that hyperchloremic acidosis leads to increased inflammation in vivo has been limited to nitric oxide.
OBJECTIVES To determine if acidosis, induced by dilute hydrochloric acid (HCl) infusion, will increase circulating inflammatory mediator levels in an experimental model of severe sepsis in rats. METHODS Eighteen hours after inducing lethal sepsis by cecal ligation and puncture in 20 adult male Sprague-Dawley rats, we randomized the animals into three groups. In groups 2 and 3, we began an IV infusion of 0.1 N HCl to reduce the standard base excess (SBE) by 5 to 10 mEq/L and 10 to 15 mEq/L, respectively. In group 1, we infused a similar volume of lactated Ringer solution. In all groups the infusion continued for 8 h or until the animal died. MEASUREMENTS AND MAIN RESULTS We measured arterial blood gases, whole-blood lactate and chloride, and tumor necrosis factor (TNF), interleukin (IL)-6, and IL-10 levels at 0 h, 4 h, and 8 h. All measured cytokines increased over time. Compared to group 1, animals in groups 2 and 3 exhibited a greater increase in all three cytokines, with the greatest increases seen with severe acidosis. CONCLUSION Moderate (SBE, -5 to -10) and severe (SBE, -10 to -15) acidosis, induced by HCl infusion, increases circulating levels of IL-6, IL-10, and TNF in normotensive septic rats. BACKGROUND Hydroxyethyl starch (HES) may affect blood coagulation. We studied the effects of a modified, balanced, high-molecular-weight [mean molecular weight (MW) 550 kDa], high-substituted [degree of substitution (DS) 0.7] HES preparation (Hextend) on coagulation in patients undergoing major abdominal surgery. METHODS Patients were allocated randomly to receive Hextend (n = 21), lactated Ringer's solution (RL, n = 21) or 6% HES with a low MW (130 kDa) and a low DS (0.4) (n = 21). The infusion was started after induction of anaesthesia and continued until the second postoperative day to maintain central venous pressure between 8 and 12 mm Hg.
Activated thrombelastography (TEG) was used to assess coagulation. Different activators were used (extrinsic and intrinsic activation of TEG), and aprotinin was added to assess hyperfibrinolytic activity (ApTEG). We measured the onset of coagulation [coagulation time (CT = reaction time, r)], the kinetics of clot formation [clot formation time (CFT = coagulation time, k)] and maximum clot firmness (MCF = maximal amplitude, MA). Measurements were performed after induction of anaesthesia, at the end of surgery, 5 h after surgery and on the mornings of the first and second days after surgery. RESULTS Significantly more HES 130/0.4 [2590 (SD 260) ml] than Hextend [1970 (310) ml] was given. Blood loss was greatest in the Hextend group and did not differ between RL- and HES 130/0.4-treated patients. Baseline TEG data were similar and within the normal range. CT and CFT were greater in the Hextend group immediately after surgery, 5 h after surgery and on the first day than in the two other groups. ApTEG MCF also changed significantly in the Hextend patients, indicating more pronounced fibrinolysis. Volume replacement using RL caused moderate hypercoagulability, shown by a decrease in CT. CONCLUSION A modified, balanced, high-molecular-weight HES with a high degree of substitution (Hextend) adversely affected measures of coagulation in patients undergoing major abdominal surgery, whereas a preparation with a low MW and low DS affected these measures of haemostasis less. Large amounts of RL decreased the coagulation time. Intravenous fluid replacement in adult elective surgery is often initiated with dextrose-containing fluids. We sought to determine if this practice resulted in significant hyperglycaemia and if there was a risk of hypoglycaemia if non-dextrose-containing crystalloids were used instead.
We conducted a randomized controlled trial in 50 non-diabetic adult patients undergoing elective surgery which did not involve entry into major body cavities, large fluid shifts, or require administration of >500 ml of intravenous fluid in the first two hours of peri-operative care. Patients received 500 ml of either 5% dextrose in 0.9% normal saline, lactated Ringer's solution, or 0.9% normal saline over 45 to 60 minutes. Plasma glucose, electrolytes and osmolality were measured prior to infusion, and at 15 minutes and one hour after completion of infusion. None of the patients had preoperative hypoglycaemia despite average fasting times of almost 13 hours. Patients receiving lactated Ringer's and normal saline remained normoglycaemic throughout the study period. Patients receiving dextrose saline had significantly elevated plasma glucose 15 minutes after completion of infusion (11.1 (9.9–12.2, 95% CI) mmol/l). Plasma glucose exceeded 10 mmol/l in 72% of patients receiving dextrose saline. There was no significant difference in plasma glucose between the groups at one hour after infusion, but 33% of patients receiving DS had plasma glucose ≥8 mmol/l. We conclude that initiation of intravenous fluid replacement with dextrose-containing solutions is not required to prevent hypoglycaemia in elective surgery. On the contrary, a relatively small volume of 500 ml causes significant, albeit transient, hyperglycaemia, even in non-diabetic patients.

The influence of four different kinds of intravascular volume replacement on platelet function was investigated in 60 patients undergoing elective aortocoronary bypass grafting using cardiopulmonary bypass (CPB). In a randomized sequence, high-molecular-weight hydroxyethyl starch solution (HMW-HES, mean molecular weight [Mw] 450,000 d), low-molecular-weight HES (LMW-HES, Mw 200,000 d), 3.5% gelatin or 5% albumin were infused preoperatively to double reduced filling pressure (pulmonary capillary wedge pressure [PCWP] <5 mm Hg). Fifteen untreated patients served as a control. Platelet function was assessed by aggregometry using a turbidometric technique (inductors: ADP, epinephrine, collagen). Maximum aggregation, maximum gradient of aggregation, and platelet volume were measured before, during, and after CPB until the first postoperative day. HMW-HES 840 +/- 90 mL, LMW-HES 850 +/- 100 mL, gelatin 950 +/- 110 mL, and albumin 810 +/- 100 mL were given preoperatively. Maximum platelet aggregation (ranging from -23% to -44% relative from baseline values) and maximum gradient of platelet aggregation (ranging from -26% to -45% relative from baseline values) were reduced only in the HMW-HES patients. After CPB, aggregometry also was impaired most markedly in these patients. The other volume groups showed less reduction in platelet aggregation and were similar to the untreated control. On the first postoperative day, aggregation variables had returned almost to baseline in all patients. Platelet volume was the same among the groups within the investigation period. Postbypass blood loss was highest in the HMW-HES group (890 +/- 180 mL). There was a significant (P < 0.04) correlation in this group between blood loss and change in platelet aggregation. (ABSTRACT TRUNCATED AT 250 WORDS)

OBJECTIVE Balanced fluids appear to have advantages over unbalanced fluids for correcting hypovolemia. The effects of a new balanced hydroxyethyl starch (HES) were studied in cardiac surgery patients. DESIGN Prospective, randomized, unblinded study. SETTING Clinical study in a single cardiac surgery institution. PARTICIPANTS Sixty patients undergoing elective cardiac surgery with cardiopulmonary bypass. INTERVENTION Patients received either a balanced 6% HES 130/0.4 plus a balanced crystalloid (n = 30) or an unbalanced HES-in-saline plus saline (n = 30) to keep cardiac index >2.5 L/min/m(2). MEASUREMENTS AND MAIN RESULTS Base excess (BE), kidney function, inflammatory response (interleukins-6, -10), endothelial activation (intercellular adhesion molecule-1 [ICAM]), and coagulation (thromboelastometry, whole blood aggregation) were measured after induction of anesthesia, after surgery and 5 hours later, and at the 1st and 2nd postoperative days; 2,950 +/- 530 mL of balanced and 3,050 +/- 560 mL of unbalanced HES were given. BE was reduced significantly in the unbalanced group (from 1.11 +/- 0.71 mmol/L to -5.11 +/- 0.48 mmol/L after surgery) and remained unchanged in the balanced group. Balanced volume replacement resulted in significantly lower IL-6, IL-10, and ICAM plasma concentrations and lower urine concentrations of kidney-specific proteins than in the unbalanced group. After surgery, thromboelastometry data and platelet function were changed significantly in both groups; 5 hours thereafter they were significantly changed only in the unbalanced group. CONCLUSION A plasma-adapted HES preparation in addition to a balanced crystalloid resulted in significantly less decline in BE, less increase in concentrations of kidney-specific proteins, less inflammatory response and endothelial damage, and fewer changes in hemostasis compared with an unbalanced fluid strategy.
Three trials in this review found no differences between an NSAID and
Low back pain is one of the most frequently encountered conditions in clinical practice (1, 2). The most commonly prescribed medications for low back pain are nonsteroidal anti-inflammatory drugs (NSAIDs), skeletal muscle relaxants, antidepressants, and opioids (3–5); benzodiazepines, systemic corticosteroids, and antiseizure medications are also prescribed (3). Patients often use over-the-counter acetaminophen and NSAIDs. A 2007 guideline (6) and associated systematic review (7) from the American College of Physicians (ACP) and American Pain Society (APS) found evidence to support the use of acetaminophen and NSAIDs as first-line pharmacologic options for low back pain; secondary options were skeletal muscle relaxants, benzodiazepines, and antidepressants. New evidence and medications are now available. Here, we review the current evidence on benefits and harms of medications for low back pain. This article has been used by ACP to update a clinical practice guideline, also in this issue. This article addresses the key question: what are the comparative benefits and harms of different systemic pharmacologic therapies for acute or chronic nonradicular low back pain, radicular low back pain, or spinal stenosis?

Objectives: To test the effects of pregabalin on the induction of neurogenic claudication. Methods: This study was a randomized, double-blind, active placebo-controlled, 2-period, crossover trial. Twenty-nine subjects were randomized to receive pregabalin followed by active placebo (i.e., diphenhydramine) or active placebo followed by pregabalin. Each treatment period lasted 10 days, including a 2-step titration. Periods were separated by a 10-day washout period, including a 3-day taper phase after the first period. The primary outcome variable was the time to first moderate pain symptom (Numeric Rating Scale score ≥4) during a 15-minute treadmill test (Tfirst). Secondary outcome measures included pain intensity at rest, pain intensity at the end of the treadmill test, distance walked, and validated self-report measures of pain and functional limitation including the Roland-Morris Disability Questionnaire, modified Brief Pain Inventory–Short Form, Oswestry Disability Index, and Swiss Spinal Stenosis Questionnaire. Results: No significant difference was found between pregabalin and active placebo for the time to first moderate pain symptom (difference in median Tfirst = −1.08 [95% confidence interval −2.25 to 0.08], p = 0.61). In addition, none of the secondary outcome measures of pain or functional limitation were significantly improved by pregabalin compared with active placebo. Conclusions: Pregabalin was not more effective than active placebo in reducing painful symptoms or functional limitations in patients with neurogenic claudication associated with lumbar spinal stenosis. Classification of evidence: This study provides Class I evidence that for patients with neurogenic claudication, compared with diphenhydramine, pregabalin does not increase the time to moderate pain during a treadmill test.

Study Design.
A prospective, randomized, single (investigator) blind, comparative efficacy trial was conducted. Objective. To compare the efficacy of continuous low-level heat wrap therapy (40 C, 8 hours/day) with that of ibuprofen (1200 mg/day) and acetaminophen (4000 mg/day) in subjects with acute nonspecific low back pain. Summary of Background Data. The efficacy of topical heat methods, as compared with oral analgesic treatment of low back pain, has not been established. Methods. Subjects (n = 371) were randomly assigned to heat wrap (n = 113), acetaminophen (n = 113), or ibuprofen (n = 106) for efficacy evaluation, or to oral placebo (n = 20) or unheated back wrap (n = 19) for blinding. Outcome measures included pain relief, muscle stiffness, lateral trunk flexibility, and disability. Efficacy was measured over two treatment days and two follow-up days. Results. Day 1 pain relief for the heat wrap (mean, 2) was higher than for ibuprofen (mean, 1.51; P = 0.0007) or acetaminophen (mean, 1.32; P = 0.0001). Extended mean pain relief (Days 3 to 4) for the heat wrap (mean, 2.61) also was higher than for ibuprofen (mean, 1.68; P = 0.0001) or acetaminophen (mean, 1.95; P = 0.0009). Lateral trunk flexibility was improved with the heat wrap (mean change, 4.28 cm) during treatment (P ≤ 0.009 vs acetaminophen [mean change, 2.93 cm], P ≤ 0.001 vs ibuprofen [mean change, 2.51 cm]). The results were similar on Day 4. Day 1 reduction in muscle stiffness with the heat wrap (mean, 16.3) was greater than with acetaminophen (mean, 10.5; P = 0.001). Disability was reduced with the heat wrap (mean, 4.9), as compared with ibuprofen (mean, 2.7; P = 0.01) and acetaminophen (mean, 2.9; P = 0.0007), on Day 4. None of the adverse events were serious. The highest rate (10.4%) was reported in the ibuprofen group. Conclusion. Continuous low-level heat wrap therapy was superior to both acetaminophen and ibuprofen for treating low back pain.

A double-blind, 18-center, balanced trial of diflunisal vs. cyclobenzaprine HCl vs. these two drugs combined vs. placebo produced complete results from 175 patients. They had sought treatment at the cooperating centers for acute painful spasms of the back within a day or two of trauma or strain. Global results over the 7 to 10 days of observations revealed a clinically and statistically significant superiority of the combined therapy by Day 4 (P=0.006) and almost all patients recovered within a week to 10 days. A combination therapy with an effective safe analgesic and a true muscle relaxant for less than a week appears to be an excellent relief measure for acute back problems.

Tizanidine and aceclofenac individually have shown efficacy in the treatment of low back pain. The efficacy and tolerability of the combination have not yet been established. The objective of the study was to evaluate the efficacy and safety of an aceclofenac–tizanidine fixed-dose combination against aceclofenac alone in patients with acute low back pain. This double-blind, double-dummy, randomized, comparative, multicentric, parallel-group study enrolled 197 patients of either sex in the age range of 18–70 years with acute low back pain. The patients were randomized to receive either aceclofenac (100 mg)–tizanidine (2 mg) b.i.d. or aceclofenac (100 mg) alone b.i.d. for 7 days. The primary efficacy outcomes were pain intensity (on movement, at rest and at night; on VAS scale) and pain relief (on a 5-point verbal rating scale). The secondary efficacy outcome measures included functional impairment (modified Schober's test and lateral body bending test) and patient's and investigator's global efficacy assessment.
Aceclofenac–tizanidine was significantly superior to aceclofenac for pain intensity (on movement, at rest and at night; P < 0.05) and pain relief (P = 0.00) on days 3 and 7. There was a significant increase in spinal flexion in both groups from baseline on days 3 and 7, with a significant difference in favour of the combination group (P < 0.05). There were significantly more patients with an excellent to good response for the aceclofenac–tizanidine treatment as compared to aceclofenac alone (P = 0.00). Both treatments were well tolerated. In this study, the aceclofenac–tizanidine combination was more effective than aceclofenac alone and had a favourable safety profile in the treatment of acute low back pain.

This large, open-label, randomized, parallel-group, multicenter study compared two oral sustained-release opioids (SROs) -- AVINZA (A-MQD), morphine sulfate extended-release capsules given once a day, and OxyContin (O-ER), oxycodone modified-release tablets given twice a day -- in SRO-naive subjects ages 30 to 70 with chronic, moderate to severe low back pain. Of the 392 subjects enrolled and randomized, 266 (132 in the A-MQD group and 134 in the O-ER group) completed the opioid dose titration phase and entered an eight-week evaluation phase. During the evaluation phase, A-MQD achieved significantly better pain control than O-ER, as demonstrated by a greater decrease from baseline in pain scores obtained four times daily during weeks one, four, and eight (p = 0.002). The number of breakthrough-pain rescue medication doses adjusted for the number of patient days was significantly lower in the A-MQD group (p < 0.0001). Better pain control with A-MQD was achieved with a significantly lower daily opioid dose than with O-ER (mean 69.9 mg and 91 mg morphine equivalents, respectively; p = 0.0125). Quality of sleep was significantly better with A-MQD for the entire evaluation phase (p = 0.0026).
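The dose comparison above is expressed in oral morphine equivalents, the usual way of putting different opioids on a common scale. A minimal sketch of that arithmetic follows; the oxycodone-to-morphine factor of 1.5 is a commonly cited convention, not a value taken from the abstract, and the function name is illustrative only.

```python
# Hedged sketch: daily oral-morphine-equivalent (MME) arithmetic of the kind
# behind the "69.9 vs 91 mg morphine equivalents" comparison above.
# The 1.5 factor for oral oxycodone is a common convention (assumption).
FACTORS = {"morphine": 1.0, "oxycodone": 1.5}

def daily_mme(drug: str, dose_mg_per_admin: float, admins_per_day: int) -> float:
    """Return the daily dose expressed in oral morphine equivalents (mg)."""
    return FACTORS[drug] * dose_mg_per_admin * admins_per_day

# e.g. a hypothetical 30 mg oxycodone twice daily:
print(daily_mme("oxycodone", 30, 2))  # 90.0 mg morphine equivalents
```

Once both arms are on this common scale, the trial's "lower daily opioid dose" claim is a direct comparison of two numbers.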
The incidence and severity of elicited opioid side effects were similar in the two groups. This trial demonstrated that once-daily A-MQD provides consistent around-the-clock pain relief in patients with low back pain. In patients who completed opioid dose titration, A-MQD was significantly better than O-ER for reducing pain and improving sleep, while requiring a lower daily opioid dose.

We evaluated the efficacy of pregabalin in patients with chronic lumbosacral radiculopathy. This randomized, controlled, withdrawal trial included five phases: screening (4–18 days); run-in (4–10 days) to screen out placebo responders; single-blind (28 days) to identify pregabalin responders; double-blind to randomize responders to pregabalin or placebo (35 days); and final study medication taper (7 days). The primary endpoint was time to loss of response (LOR) during the double-blind phase (≥1-point increase in pain, discontinuation, or rescue-medication use). In the single-blind phase, 58% of patients had ≥30% pain reduction. In the double-blind phase, pregabalin (n = 110) and placebo (n = 107) groups did not differ significantly in time to LOR. Adverse events caused the discontinuation of 9.9% and 5.6% of pregabalin-treated and placebo-treated patients, respectively. Most patients with chronic lumbosacral radiculopathy responded to pregabalin therapy; however, time to LOR did not significantly differ between pregabalin and placebo. Considering the results of all phases of the study, it is difficult to draw definitive conclusions from it, suggesting a need for further work to understand the clinical potential of pregabalin treatment for lumbosacral radiculopathy.

Background Corticoids have potent anti-inflammatory effects, which may help in relieving pain and dysfunction associated with lumbar canal stenosis. We assessed the effectiveness of a decreasing-dose regimen of oral corticoids in the treatment of lumbar canal stenosis in a prospective, double-blind, randomized, placebo-controlled trial. Results Sixty-one patients with lumbar canal stenosis (50–75 years; canal area <100 mm2 at L3/L4, L4/L5, and/or L5/S1 on magnetic resonance imaging; and claudication within 100 m) were electronically randomized to an oral corticoid group (n = 31) or a placebo group (n = 30). The treatment group received 1 mg/kg of oral corticoids daily, with a dose reduction of one-third per week for 3 weeks. Patients and controls were assessed by the Short Form 36 Health Survey, Roland–Morris Questionnaire, 6-min walk test, visual analog scale, and a Likert scale. All instruments showed similar outcomes for the corticoid and placebo groups (P > 0.05). Obese patients exhibited more severe symptoms compared with non-obese patients. L4/L5 stenosis was associated with more severe symptoms compared with stenosis at other levels. Conclusion The oral corticoid regimen used in this study was not effective in the treatment of lumbar canal stenosis.

Study Design. Two replicate, 4-week, randomized, double-blind, placebo-controlled trials of rofecoxib 25 and 50 mg versus placebo for chronic low back pain. Objectives. To determine the efficacy and safety of two doses of rofecoxib compared to placebo in the treatment of chronic low back pain. Summary of Background Data. Although nonsteroidal anti-inflammatory drugs are commonly prescribed for chronic low back pain, their efficacy is unproven and toxicity can be serious. These studies evaluated the efficacy and tolerability of rofecoxib, a selective COX-2 inhibitor, in the treatment of chronic low back pain. Methods. Patients with chronic low back pain were randomized 1:1:1 to rofecoxib 25 mg, 50 mg, or placebo once daily. Primary endpoint: Low Back Pain Intensity.
Secondary endpoints: Pain Bothersomeness, Global Assessments of Response to Therapy, Global Assessment of Disease Status, Roland-Morris Disability Questionnaire, SF-12 Health Survey, Use of Rescue Acetaminophen, and Discontinuations Due to Lack of Efficacy. Results. Combining both studies, 690 patients were randomized to placebo (N = 228), rofecoxib 25 mg (N = 233), or rofecoxib 50 mg (N = 229). Mean (±SD) age was 53.4 (±12.9) years, pain duration 12.1 (±11.8) years, 62.3% female. Both rofecoxib groups improved significantly. Mean differences from placebo in pain intensity were −13.50 mm and −13.81 mm (25 and 50 mg doses, respectively) (P < 0.001). Both regimens were superior to placebo in eight of nine secondary endpoints. Fifty mg provided no advantage over 25 mg. Both rofecoxib regimens were well tolerated, although 25 mg had a slightly better safety profile. Conclusions. Rofecoxib significantly reduced chronic low back pain in adults and was well tolerated.

Objective: This multicenter, double-blind, placebo-controlled study using a randomized withdrawal design evaluated the efficacy and safety of once-daily OROS hydromorphone ER in the treatment of opioid-tolerant patients with chronic moderate-to-severe low back pain (LBP). Main outcome measures: The primary efficacy assessment was mean change in pain intensity based on patient diary Numeric Rating Scale (NRS) scores from baseline to final visit of the 12-week double-blind phase. Secondary endpoints included mean change from baseline to each visit in patient diary NRS scores and office NRS scores; time to treatment failure; Patient Global Assessment; rescue medication use; and Roland-Morris Disability Questionnaire total scores. Clinical trial registration: ClinicalTrials.gov identifier: NCT00549042. Results: For the primary outcome measure, hydromorphone ER significantly reduced pain intensity compared to placebo (p < 0.001). Median diary NRS score change from baseline to endpoint was significantly lower for hydromorphone ER (0.2 units) compared to placebo (1.2 units). A significantly higher proportion of hydromorphone ER (60.6%) vs. placebo (42.9%) patients had at least a 30% reduction in diary NRS pain score from screening to endpoint (p < 0.01). Hydromorphone ER was well tolerated, although 60 (13%) discontinued during the enrichment phase for adverse events, and more active (9, 6.7%) than placebo (4, 3.0%) patients discontinued treatment for adverse events during the randomized phase. Conclusions: These results provide evidence for the efficacy and safety of hydromorphone ER in opioid-tolerant patients with chronic moderate-to-severe LBP. Potential limitations include the shortened dose-conversion/titration phase, limiting the daily allowable dose of hydromorphone ER to 64 mg, and the allowance of limited rescue medication throughout the entire double-blind phase. Other trial design elements, such as the use of an enrichment phase and the inclusion of only opioid-tolerant patients, may limit the generalizability of these results.

Objective Chronic low back pain (CLBP) is a widespread ailment. The aim of this study was to assess the efficacy of topiramate in the treatment of CLBP and the changes in anger status and processing, body weight, subjective pain-related disability and health-related quality of life during the course of treatment. Methods We conducted a 10-week, randomized, double-blind, placebo-controlled study of topiramate in 96 (36 women) patients with CLBP. The subjects were randomly assigned to topiramate (n=48) or placebo (n=48). Primary outcome measures were changes on the McGill Pain Questionnaire, State-Trait Anger Expression Inventory, Oswestry Low Back Pain Disability Questionnaire and SF-36 Health Survey scales, and in body weight. Results In comparison with the placebo group (according to the intent-to-treat principle), significant changes on the pain rating index of the McGill Pain Questionnaire (Ps<0.001), State-Trait Anger Expression Inventory scales (all Ps<0.001), Oswestry Low Back Pain Disability Questionnaire (P<0.001), and SF-36 Health Survey scales (all P<0.001, except on the role-emotional scale) were observed after 10 weeks in the patients treated with topiramate. Weight loss was also observed and was significantly more pronounced in the group treated with topiramate than in those treated with placebo (P<0.001). Most patients tolerated topiramate relatively well, but 2 patients dropped out because of side effects. Discussion Topiramate seems to be a relatively safe and effective agent in the treatment of CLBP. Significantly positive changes in pain sensitivity, anger status and processing, subjective disability, health-related quality of life, and loss of weight were observed.

IMPORTANCE Oral steroids are commonly used to treat acute sciatica due to a herniated disk but have not been evaluated in an appropriately powered clinical trial. OBJECTIVE To determine if oral prednisone is more effective than placebo in improving function and pain among patients with acute sciatica. DESIGN, SETTING, AND PARTICIPANTS Randomized, double-blind, placebo-controlled clinical trial conducted from 2008 to 2013 in a large integrated health care delivery system in Northern California. Adults (n=269) with radicular pain for 3 months or less, an Oswestry Disability Index (ODI) score of 30 or higher (range, 0–100; higher scores indicate greater dysfunction), and a herniated disk confirmed by magnetic resonance imaging were eligible.
INTERVENTIONS Participants were randomly assigned in a 2:1 ratio to receive a tapering 15-day course of oral prednisone (5 days each of 60 mg, 40 mg, and 20 mg; total cumulative dose = 600 mg; n = 181) or matching placebo (n = 88). MAIN OUTCOMES AND MEASURES The primary outcome was ODI change at 3 weeks; secondary outcomes were ODI change at 1 year, change in lower extremity pain (measured on a 0–10 scale; higher scores indicate more pain), spine surgery, and Short Form 36 Health Survey (SF-36) Physical Component Summary (PCS) and Mental Component Summary (MCS) scores (0–100 scale; higher scores better). RESULTS Observed baseline and 3-week mean ODI scores were 51.2 and 32.2 for the prednisone group and 51.1 and 37.5 for the placebo group, respectively. The prednisone-treated group showed an adjusted mean 6.4-point (95% CI, 1.9–10.9; P = .006) greater improvement in ODI scores at 3 weeks than the placebo group and a mean 7.4-point (95% CI, 2.2–12.5; P = .005) greater improvement at 52 weeks. Compared with the placebo group, the prednisone group showed an adjusted mean 0.3-point (95% CI, -0.4 to 1.0; P = .34) greater reduction in pain at 3 weeks and a mean 0.6-point (95% CI, -0.2 to 1.3; P = .15) greater reduction at 52 weeks. The prednisone group showed an adjusted mean 3.3-point (95% CI, 1.3–5.2; P = .001) greater improvement in the SF-36 PCS score at 3 weeks, no difference in the SF-36 PCS score at 52 weeks (mean, 2.5; 95% CI, -0.3 to 5.4; P = .08), no change in the SF-36 MCS score at 3 weeks (mean, 2.2; 95% CI, -0.4 to 4.8; P = .10), and an adjusted 3.6-point (95% CI, 0.6–6.7; P = .02) greater improvement in the SF-36 MCS score at 52 weeks. There were no differences in surgery rates at 52-week follow-up. Having 1 or more adverse events at 3-week follow-up was more common in the prednisone group than in the placebo group (49.2% vs 23.9%; P < .001).
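The taper arithmetic in the prednisone arm (5 days each at 60, 40, and 20 mg, for a stated cumulative dose of 600 mg) can be checked with a short sketch; the function and its names are illustrative, not part of the trial protocol.

```python
# Illustrative check of the cumulative dose of a stepped taper, here the
# 15-day prednisone course described above (5 days each of 60/40/20 mg).
def cumulative_dose(steps):
    """steps: iterable of (daily_dose_mg, n_days) pairs."""
    return sum(dose * days for dose, days in steps)

prednisone_taper = [(60, 5), (40, 5), (20, 5)]
print(cumulative_dose(prednisone_taper))  # 600 mg, matching the abstract
```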
CONCLUSIONS AND RELEVANCE Among patients with acute radiculopathy due to a herniated lumbar disk, a short course of oral steroids, compared with placebo, resulted in modestly improved function and no improvement in pain. TRIAL REGISTRATION ClinicalTrials.gov identifier: NCT00668434.

OBJECTIVE Our purpose was to measure the agreement, reliability, construct validity, and feasibility of a measurement tool to assess systematic reviews (AMSTAR). STUDY DESIGN AND SETTING We randomly selected 30 systematic reviews from a database. Each was assessed by two reviewers using: (1) the enhanced quality assessment questionnaire (Overview of Quality Assessment Questionnaire [OQAQ]); (2) Sacks' instrument; and (3) our newly developed measurement tool (AMSTAR). We report on reliability (interobserver kappas of the 11 AMSTAR items), intraclass correlation coefficients (ICCs) of the sum scores, construct validity (ICCs of the sum scores of AMSTAR compared with those of other instruments), and completion times. RESULTS The interrater agreement on the individual items of AMSTAR was substantial, with a mean kappa of 0.70 (95% confidence interval [CI]: 0.57, 0.83) (range: 0.38–1.0). Kappas recorded for the other instruments were 0.63 (95% CI: 0.38, 0.78) for enhanced OQAQ and 0.40 (95% CI: 0.29, 0.50) for the Sacks' instrument. The ICC of the total score for AMSTAR was 0.84 (95% CI: 0.65, 0.92), compared with 0.91 (95% CI: 0.82, 0.96) for OQAQ and 0.86 (95% CI: 0.71, 0.94) for the Sacks' instrument. AMSTAR proved easy to apply, each review taking about 15 minutes to complete. CONCLUSIONS AMSTAR has good agreement, reliability, construct validity, and feasibility.
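The per-item interobserver kappas reported for AMSTAR above are chance-corrected agreement statistics. As a reminder of what that statistic measures, here is a minimal sketch of Cohen's kappa for one binary item rated by two reviewers; the example ratings are invented for illustration and are not AMSTAR data.

```python
# Minimal sketch of Cohen's kappa for two raters on one binary item,
# the statistic reported per AMSTAR item above. Example data are invented.
def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement if the two raters were independent, per category.
    cats = set(rater_a) | set(rater_b)
    expected = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in cats
    )
    return (observed - expected) / (1 - expected)

a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]  # reviewer 1's yes/no ratings
b = [1, 1, 0, 0, 0, 1, 1, 0, 1, 1]  # reviewer 2's yes/no ratings
print(round(cohens_kappa(a, b), 2))  # 0.78 (9/10 raw agreement, chance-corrected)
```

A kappa near 0.70, as AMSTAR reports on average, is conventionally read as "substantial" agreement.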
These findings need confirmation by a broader range of assessors and a more diverse range of reviews.

Two 6-week studies compared the analgesic efficacy, tolerability and safety of a non-steroidal anti-inflammatory drug (celecoxib 200 mg twice a day [bid]) and an opioid (tramadol HCl 50 mg four times a day [qid]) in subjects with chronic low-back pain (CLBP). Successful responders (primary endpoint) were defined as subjects completing 6 weeks of treatment and having ≥30% improvement on the Numerical Rating Scale for pain. A total of 796 and 802 subjects were randomized to treatment in study 1 and study 2, respectively. A significantly greater percentage of celecoxib-treated subjects were successful responders compared with tramadol HCl-treated subjects (study 1: 63.2% versus 49.9%, respectively; study 2: 64.1% versus 55.1%, respectively). Fewer adverse events (AEs) and serious AEs were reported in the celecoxib-treated group. Overall, celecoxib 200 mg bid was more effective than tramadol HCl 50 mg qid in the treatment of CLBP, with fewer AEs reported.

Purpose Treatment of patients with chronic low back pain (CLBP) aims to reduce disability and improve functional capacity and participation. Time-contingent prescription of analgesics is a treatment modality in CLBP. The impact of analgesics on functional capacity is unknown. The aim of the study was to explore the effect of analgesics on functioning measured by functional capacity evaluation, and self-reported disability in patients with CLBP. Methods An explorative randomized placebo-controlled clinical trial was performed in an outpatient pain rehabilitation setting on patients waiting for rehabilitation. Included patients had low back pain lasting >3 months, visual analogue scale worst pain ≥4.0 cm, and age >18 years. Outcome measures before (T0) and after treatment (T1): functional capacity, pain intensity, Roland-Morris Disability Questionnaire.
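The "≥30% improvement" responder rule used as the primary endpoint in the celecoxib-versus-tramadol studies above (and as a threshold in several other trials in this collection) is simple to express directly. The function below is an illustrative sketch, not code from any of the studies.

```python
# Illustrative sketch of the responder criterion described above: a subject
# is a successful responder if they completed the 6-week treatment period
# and pain improved by at least 30% from baseline on the NRS.
def is_responder(baseline_nrs: float, endpoint_nrs: float,
                 completed_six_weeks: bool) -> bool:
    if not completed_six_weeks or baseline_nrs <= 0:
        return False
    improvement = (baseline_nrs - endpoint_nrs) / baseline_nrs
    return improvement >= 0.30

print(is_responder(7.0, 4.5, True))   # True  (~35.7% improvement)
print(is_responder(7.0, 5.5, True))   # False (~21.4% improvement)
```

The reported responder percentages (e.g. 63.2% vs 49.9% in study 1) are simply the fraction of each arm for which this rule evaluates true.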
T1 : global perceived pain relief . Patient characteristics and psychological question naires were assessed . Fifty patients were included in this study and were r and omly assigned to 2 weeks treatment or placebo . Treatment : acetaminophen/tramadol 325 mg/37.5 mg per capsule . Dose : maximum acetaminophen 1,950 mg and tramadol 225 mg per day ; treatment and placebo titrated identically . Compliance and side-effects were monitored . Treatment effects between groups over time were compared . Results One patient ( treatment group ) was lost to follow-up . Forty-nine patients remained in the study . Treatment effects in primary outcomes did not differ significantly between groups . A subgroup of 10 ( 42 % ) patients ( treatment group ) reported global pain relief ( responders ) who reduced self-reported disability ( p < 0.05 ) . Responders had significantly lower catastrophizing scores . Conclusion Overall treatment effects were small and non-significant . A subgroup , however , reported improved functioning as a result of treatment . Responders had lower catastrophizing scores Abstract Objective : This study examines the efficacy of the buprenorphine transdermal system ( BTDS ) for reducing the interference of pain on physical and emotional functioning associated with chronic low back pain ( CLBP ) . Methods : A post-hoc analysis used data from a r and omized , placebo-controlled , double-blind trial of patients with moderate-to-severe CLBP . The Brief Pain Inventory ( BPI ) measured pain interference at screening , following a run-in period , and during the 12-week double-blind treatment phase . 
Statistical analyses examined treatment arm differences ( BTDS vs placebo ) for the following : BPI Interference subscale items and subscale scores at the trial end point ( week 12 ) ; patterns of change in the Interference subscale scores over time ; proportions of patients indicating mild or no interference following treatment ; and proportions of patients showing improvement ( 30 % , 50 % , 2-point , or 4-point change in score from screening to week 12 ) for each item and subscale . Results : Mean scores for BPI Interference items and Interference subscale were significantly lower ( ie , indicated less interference ) for BTDS than for placebo ( all P < 0.001 ) . Treatment arm differences in Interference subscale scores emerged within 4 weeks of treatment . The BTDS patients were significantly more likely to indicate mild/no interference on 5 of 7 Interference subscale items following treatment ( P < 0.05 ) . For most comparisons , BTDS patients were significantly more likely to show criterion-level improvements in Interference item and subscale scores ( P < 0.05 for differences ) . Discussion : Results indicate the efficacy of BTDS treatment , compared with placebo , for reducing the interference of pain on physical and emotional functioning in patients with moderate-to-severe CLBP . The advantage of BTDS was observed within 4 weeks of treatment , and was maintained throughout the 12-week treatment phase Background The efficacy and safety of the association of celecoxib [ a selective cyclooxygenase-2 ( COX-2 ) inhibitor ] and pregabalin ( commonly used to control neuropathic pain ) , compared with monotherapy of each , were evaluated for the treatment of chronic low-back pain , a condition known to be due to neuropathic as well as nociceptive pain mechanisms . 
Materials and methods In this prospective randomized trial, 36 patients received three consecutive 4-week treatment regimes, randomly assigned: celecoxib plus placebo, pregabalin plus placebo, and celecoxib plus pregabalin. All patients were assessed by using a visual analogue scale (VAS, 0–100 mm) and the Leeds Assessment of Neuropathic Symptoms and Signs (LANSS) pain scale by an investigator blinded to the administered pharmacological treatment. Results Celecoxib and pregabalin were effective in reducing low-back pain when patients were pooled according to LANSS score. The association of celecoxib and pregabalin was more effective than either monotherapy in a mixed population of patients with chronic low-back pain and when data were pooled according to LANSS score. Adverse effects of drug association and monotherapies were similar, with reduced drug consumption in the combined therapy. Conclusions Combination of celecoxib and pregabalin is more effective than monotherapy for chronic low-back pain, with similar adverse effects UNLABELLED Opioid-experienced (N = 250) patients with chronic, moderate to severe low back pain (LBP) were converted from their pre-study opioid(s) to an approximately equianalgesic dose of OPANA ER (oxymorphone extended release). Patients continued slow titration, with 56% stabilized within 1 month to a dose of OPANA ER that reduced average pain to < 40 mm on a visual analog scale with good tolerability. Stabilized patients (n = 143) were randomized to placebo or their stabilized dose of OPANA ER every 12 hours for a 12-week double-blind period. Pain intensity increased significantly more for patients randomized to placebo than for patients who continued their stabilized dose of OPANA ER; the increase from baseline (at randomization) to final visit was 31.6 mm for placebo versus 8.7 mm with OPANA ER (P < .0001).
During double-blind treatment, placebo patients were approximately 8-fold more likely than OPANA ER patients to discontinue because of lack of efficacy (P < .001). Discontinuations as a result of adverse events were similar between groups, 10% with placebo and 11% with OPANA ER. Opioid-related adverse events included constipation (6%), somnolence (3%), and nausea (3%). Fifty-seven percent of opioid-experienced patients with chronic, moderate to severe LBP achieved a stable dose of OPANA ER that was efficacious and generally well-tolerated for up to 12 weeks. PERSPECTIVE In a 12-week, double-blind, randomized, placebo-controlled trial in opioid-experienced patients with chronic, moderate to severe LBP, OPANA ER provided efficacious, long-term analgesia and was generally well-tolerated. OPANA ER may provide clinicians with a new treatment option for patients experiencing suboptimal analgesic responses or poor tolerability with other opioids Previous studies have shown a positive association between pain and depression, though evidence supporting a direct link between these two variables is less robust. Using a placebo-controlled trial, the authors examined the analgesic and antidepressant efficacy of paroxetine (20 mg) in chronic low back pain sufferers. The authors examined the associations among pain, depression, disability, and illness attitudes. Paroxetine showed no effects on pain or depression compared with placebo; however, subjects randomized to paroxetine were more likely to reduce concomitant analgesic medication. The cross-sectional association of depression and pain at baseline (r = 0.2, P = 0.02) was weaker than the association between depression and disability (r = 0.3, P = 0.004). Similarly, the association of change in depression scores with change in pain (r = 0.25, P = 0.016) was weaker than the association of change in depression with change in disability (r = 0.49, P < 0.0005).
Whereas the relationship between pain and depression became nonsignificant when disability and illness attitudes were controlled, the relationship between depression and disability remained highly significant when pain and illness attitudes were controlled. These data are consistent with the association between pain and depression being wholly modulated by disability and illness attitudes, with no direct relationship between pain and depression Purpose Pregabalin and opioids are used to treat chronic low back pain (LBP). No previous investigations have compared the efficacy of pregabalin and that of opioids for chronic LBP. Methods We performed a randomized controlled trial of pregabalin and opioids in 65 consecutive patients aged 65 years or older who had chronic LBP. Each agent was administered randomly in different phases. Pain and activities of daily living (ADL) were evaluated after 4 weeks of treatment using the visual analog scale, Japanese Orthopaedic Association score, Roland Morris Disability Questionnaire, short-form McGill Pain Questionnaire, EuroQol quality-of-life scale, and geriatric depression scale. Neuropathic pain was evaluated using a neuropathic pain screening questionnaire. Results The effectiveness rate was 73.3% for pregabalin and 83.3% for opioids, showing no significant difference. The mean durations until the onset of effect were 10.2 and 6.1 days, respectively, albeit without significant difference. Pregabalin was effective for LBP with neuropathic pain, whereas opioids were effective for non-neuropathic pain. The improvement of ADL was greater with opioids than with pregabalin. Pregabalin was effective for LBP in patients with lower limb symptoms, whereas opioids were effective for those without lower limb symptoms.
Conclusions Aside from screening tests, consideration of neuropathic pain and lower extremity symptoms may be an integral component in the selection of the appropriate medication for chronic LBP. Moreover, the therapeutic objectives, including pain relief and/or improvement of ADL, should be specified BACKGROUND Buprenorphine is a mixed-activity, partial mu-opioid agonist. Its lipid solubility makes it well suited for transdermal administration. OBJECTIVE This study assessed the efficacy and safety profile of a 7-day buprenorphine transdermal system (BTDS) in adult (age > 18 years) patients with moderate to severe chronic low back pain previously treated with > or = 1 tablet daily of an opioid analgesic. METHODS This was a randomized, double-blind, placebo-controlled crossover study, followed by an open-label extension phase. After a 2- to 7-day washout of previous opioid therapy, eligible patients were randomized to receive BTDS 10 microg/h or matching placebo patches. The dose was titrated weekly using 10- and 20-microg/h patches (maximum, 40 microg/h) based on efficacy and tolerability. After 4 weeks, patients crossed over to the alternative treatment for another 4 weeks. Patients who completed the double-blind study were eligible to enter the 6-month open-label phase. Rescue analgesia was provided as acetaminophen 325 mg to be taken as 1 or 2 tablets every 4 to 6 hours as needed. The primary outcome assessments were daily pain intensity, measured on a 100-mm visual analog scale (VAS), from no pain to excruciating pain, and a 5-point ordinal scale, from 0 = none to 4 = excruciating.
Secondary outcome assessments included the Pain and Sleep Questionnaire (100-mm VAS, from never to always), Pain Disability Index (ordinal scale, from 0 = no disability to 11 = total disability), Quebec Back Pain Disability Scale (categorical scale, from 0 = no difficulty to 5 = unable to do), and the 36-item Short Form Health Survey (SF-36). Patients and investigators assessed overall treatment effectiveness at the end of each phase; they assessed treatment preference at the end of double-blind treatment. After implementation of a precautionary amendment, the QTc interval was measured 3 to 4 days after randomization and after any dose adjustment. All assessments performed during the double-blind phase were also performed every 2 months during the open-label extension. Adverse events were collected by non-directed questioning throughout the study. RESULTS Of 78 randomized patients, 52 (66.7%) completed at least 2 consecutive weeks of treatment in each study phase without major protocol violations (per-protocol [PP] population: 32 women, 20 men; mean [SD] age, 51.3 [11.4] years; mean weight, 85.5 [19.5] kg; 94% white, 4% black, 2% other). The mean (SD) dose of study medication during the last week of treatment was 29.8 (12.1) microg/h for BTDS and 32.9 (10.7) microg/h for placebo (P = NS). During the last week of treatment, BTDS was associated with significantly lower mean (SD) pain intensity scores compared with placebo on both the VAS (45.3 [21.3] vs 53.1 [24.3] mm, respectively; P = 0.022) and the 5-point ordinal scale (1.9 [0.7] vs 2.2 [0.8]; P = 0.044). The overall Pain and Sleep score was significantly lower with BTDS than with placebo (177.6 [125.5] vs 232.9 [131.9]; P = 0.027).
There were no treatment differences on the Pain Disability Index, Quebec Back Pain Disability Scale, or SF-36; however, BTDS was associated with significant improvements compared with placebo on 2 individual Quebec Back Pain Disability Scale items (get out of bed: P = 0.042; sit in a chair for several hours: P = 0.022). Of the 48 patients/physicians in the PP population who rated the effectiveness of treatment, 64.6% of patients (n = 31) rated BTDS moderately or highly effective, as did 62.5% of investigators (n = 30). Among the 50 patients in the PP population who answered the preference question, 66.0% of patients (n = 33) preferred the phase in which they received BTDS and 24.0% (n = 12) preferred the phase in which they received placebo (P = 0.001), with the remainder having no preference; among investigators, 60.0% (n = 30) and 28.0% (n = 14) preferred the BTDS and placebo phases, respectively (P = 0.008), with the remainder having no preference. The mean placebo-adjusted change from baseline in the QTc interval ranged from -0.8 to +3.8 milliseconds (P = NS). BTDS treatment was associated with a significantly higher frequency of nausea (P < 0.001), dizziness (P < 0.001), vomiting (P = 0.008), somnolence (P = 0.020), and dry mouth (P = 0.003), but not constipation. Of the 49 patients completing 8 weeks of double-blind treatment, 40 (81.6%) entered the 6-month, open-label extension study and 27 completed it. Improvements in pain scores achieved during the double-blind phase were maintained in these patients. CONCLUSIONS In the 8-week, double-blind portion of this study, BTDS 10 to 40 microg/h was effective compared with placebo in the management of chronic, moderate to severe low back pain in patients who had previously received opioids. The improvements in pain scores were sustained throughout the 6-month, open-label extension.
(Current Controlled Trials identification number: ISRCTN06013881) OBJECTIVE To evaluate the effectiveness and tolerability of tapentadol PR monotherapy versus tapentadol PR/pregabalin combination therapy for severe, chronic low back pain with a neuropathic component. METHODS Eligible patients had painDETECT "unclear" or "positive" ratings and average pain intensity ≥ 6 (11-point NRS-3 [average 3-day pain intensity]) at baseline. Patients were titrated to tapentadol PR 300 mg/day over 3 weeks. Patients with ≥ 1-point decrease in pain intensity and average pain intensity ≥ 4 were randomized to tapentadol PR (500 mg/day) or tapentadol PR (300 mg/day)/pregabalin (300 mg/day) during an 8-week comparative period. RESULTS In the per-protocol population (n = 288), the effectiveness of tapentadol PR was clinically and statistically comparable to tapentadol PR/pregabalin based on the change in pain intensity from randomization to final evaluation (LOCF; LSMD [95% CI], -0.066 [-0.57, 0.43]; P < 0.0001 for noninferiority). Neuropathic pain and quality-of-life measures improved significantly in both groups. Tolerability was good in both groups, in line with prior trials in the high dose range of 500 mg/day for tapentadol PR monotherapy, and favorable compared with historical combination trials of strong opioids and anticonvulsants for combination therapy. The incidence of the composite of dizziness and/or somnolence was significantly lower with tapentadol PR (16.9%) than tapentadol PR/pregabalin (27.0%; P = 0.0302).
CONCLUSIONS Tapentadol PR 500 mg is associated with comparable improvements in pain intensity and quality-of-life measures to tapentadol PR 300 mg/pregabalin 300 mg, with improved central nervous system tolerability, suggesting that tapentadol PR monotherapy may offer a favorable treatment option for severe low back pain with a neuropathic component ABSTRACT Objective: To assess the long-term efficacy, tolerability and safety of polymer-coated extended-release morphine sulfate (P-ERMS) (KADIAN*) compared with controlled-release oxycodone HCl (CRO) (OxyContin†) in treating chronic, nonmalignant, moderate to severe pain in a community-based outpatient population. *KADIAN is a licensed trademark of Alpharma Branded Products Division Inc., Piscataway, NJ, USA †OxyContin is a registered trademark of Purdue Pharma L.P., Stamford, CT, USA Design: Phase IV, prospective, randomized, open-label. Participants: Adults (N = 112) with chronic, nonmalignant, moderate to severe pain with visual numeric scale (VNS) scores ≥ 4 (0 = no pain; 10 = worst pain). Interventions: Patients were randomized to receive either P-ERMS once-daily (QD) dosing or CRO twice-daily (BID) dosing for a 24-week treatment period. Upward titration of dose and switching P-ERMS to BID or CRO to thrice-daily (TID) dosing was allowed Weeks 2–24. Main outcome measures: Quality of life (Physical [PCS] and Mental [MCS] Component Summary scores of the SF-36v2 Health Survey), pain and sleep scores (0–10), and patient and clinician assessments of current therapy (–4 to +4). Results: Patients in both treatment groups experienced significant improvements in PCS scores (P-ERMS, +2.6; CRO, +3.1; p < 0.05 vs. baseline); patients taking CRO also demonstrated improvements in MCS scores (+4.7, p < 0.05 vs. baseline).
Both groups attained significant reductions from baseline to 24 weeks in pain (P-ERMS, –2.0; CRO, –1.4; p ≤ 0.001 vs. baseline); the reduction with P-ERMS was clinically meaningful (as defined by at least a 2-point reduction in VNS score). Patients attained significant improvement in sleep scores (P-ERMS, –2.6; CRO, –1.6; p < 0.001 vs. baseline; p < 0.05, P-ERMS vs. CRO). At Week 24, both groups indicated significantly increased patient (P-ERMS, +2.6; CRO, +1.7; p < 0.001 vs. baseline) and clinician (P-ERMS, +4.0; CRO, +3.1; p < 0.001 vs. baseline) global assessments of therapy. After 24 weeks, all patients on P-ERMS were dosing within the FDA-approved frequencies (65% QD, 35% BID); 56% of patients on CRO dosed BID, but 38% dosed TID and 6% dosed four times daily (QID). Most common adverse events were constipation, nausea, and somnolence, with no significant difference between treatment groups. Conclusions: P-ERMS and CRO both relieved chronic nonmalignant pain in this community-based population; however, patients taking P-ERMS dosed in accordance with FDA-approved frequencies (QD/BID); 44% of those taking CRO dosed more frequently (TID/QID) Two hundred and sixty patients with lumbago or sciatic pain participated in a multicenter observer-blind randomized trial to compare the efficacy and tolerability of dipyrone 2.5 g, diclofenac 75 mg, and placebo administered as an intramuscular injection once daily for the duration of one to two days. The effectiveness of the test treatments in relieving sciatic pain was measured by a visual analog scale (VAS) before and 30 minutes, 1, 2, 3, 6 and 24 hours after each injection. In addition, the patient's general well-being was measured on a 5-point rating scale on day 0, 1 and 2. At the end of the trial, the patients evaluated the overall efficacy of the study drugs on a 5-point rating scale.
Minimal finger-toe distance was measured every day of the trial. Pain intensity on VAS (primary endpoint) showed a significantly greater reduction with dipyrone than with diclofenac or placebo between 1 and 6 hours after application (p < 0.01) and at the end of the trial (after 48 hours). Improvement in general well-being and minimal finger-toe distance was greatest in the dipyrone group. 59% of the patients with dipyrone assessed the overall efficacy as "excellent" or "very good", compared with 30% with diclofenac, and 18% with placebo. Adverse reactions were reported in only 7 patients (3%), 4 (5%) in the dipyrone, 1 (1%) in the diclofenac, and 2 (2%) in the placebo group UNLABELLED SUMMARY AIM The aim of this study was to evaluate the efficacy and safety of combined transdermal buprenorphine and pregabalin in chronic low back pain. PATIENTS & METHODS A total of 45 patients with chronic low back pain were recruited into the study. For an initial 3-week period, all patients received transdermal buprenorphine 35 µg/h. After 3 weeks of only transdermal buprenorphine 35 µg/h, patients were randomized (single-blind) to receive transdermal buprenorphine 35 µg/h plus pregabalin 300 mg/day (group A) or transdermal buprenorphine 35 µg/h plus placebo (group B), and were observed for a further 3-week period. Efficacy parameters were weekly mean Visual Analog Scale (VAS) scores, the Pain Rating Index (PRI) of the Short-Form McGill Pain Questionnaire (SF-MPQ), the Present Pain Index (PPI) of the SF-MPQ and sleep interference. We also evaluated the use of rescue medication (paracetamol [acetaminophen]) and the presence of adverse events. RESULTS A total of 44 patients were evaluated for efficacy and safety parameters. Pain relief, as assessed by VAS, PPI and PRI, improved significantly (p < 0.05) in all patients after the first week of treatment with only transdermal buprenorphine.
Following randomization, only patients in group A showed further reductions in the mean VAS, PPI and PRI scores. Moreover, patients in group A had a lower consumption of rescue medication than those in group B. There was a low incidence of mild adverse events in both group A and group B, with no serious adverse events in either group. CONCLUSION Pregabalin 300 mg/day as an add-on to transdermal buprenorphine 35 µg/h led to significant pain reduction and a significant reduction of interference with sleep quality in patients with chronic low back pain AIM The efficacy and safety of oral lornoxicam (LNX) as early treatment of acute sciatica/lumbo-sciatica was compared with placebo and diclofenac in a 5-day double-blind, randomised study. METHODS Male or female patients (n = 171) aged 18-70 years with acute sciatica or lumbo-sciatica [acute sciatica defined as typical radiation of pain along the sciatic nerve (including radiating pain below the knee) and worsening of pain as defined by Lasegue's leg-raising test (< 60 degrees) within 72 h and previous attack ceased > 3 months previously; lumbo-sciatica defined as symptoms of sciatica with concurrent lumbar pain and a predefined minimum pain score]. The dosage of study treatment was 8-24 mg/day LNX, 100-150 mg/day diclofenac or placebo. The primary end-point was the difference in pain intensity difference from baseline to 6 h (PID(0-6 h)) after the first dose of study treatment. Secondary end-points were pain relief, the cumulative sums of pain intensity difference and total pain relief on day 1 and on days 2-4. RESULTS In total, 164 patients completed the study. Significant differences in PID between LNX and placebo were seen in the time interval 3-8 h after the first dose including PID(0-6 h) (p = 0.015). Secondary end-points favoured LNX vs. placebo, but in general were not significantly different. LNX and diclofenac had similar analgesic effect.
Incidence and severity of adverse events were comparable for the three treatments; overall tolerability was rated as very good/good by 93% of the patients. CONCLUSION These data indicate that the analgesic efficacy of LNX is superior to placebo and similar to diclofenac in acute sciatica/lumbo-sciatica Prompted by previous results of systemic steroid treatment of symptoms of prolapsed lumbar disc, which appeared to be promising, dexamethasone phosphate was tested in a controlled, double-blind clinical study. 52 patients were included in the study: 25 received dexamethasone phosphate, 24 received placebo and 3 patients dropped out. Upon termination of the study the two groups were found to be comparable as to age, sex, duration of symptoms and degree of heavy work. We were not able to demonstrate any effect of dexamethasone phosphate on the following parameters: pain, paraesthesia, paresis/muscular weakness, disturbances of reflexes, and Lasegue. Nor was there any difference in the length of hospitalization Although not recommended for low back pain, the efficacy of systemic corticosteroids has never been evaluated in a general low back pain population. To test the efficacy of systemic corticosteroids for Emergency Department (ED) patients with low back pain, a randomized, double-blind, placebo-controlled trial of long-acting methylprednisolone was conducted with follow-up assessment 1 month after ED discharge. Patients with non-traumatic low back pain were included if their straight leg raise test was negative. The primary outcome was a comparison of the change in a numerical rating scale (NRS) 1 month after discharge. Of 87 subjects randomized, 86 were successfully followed to the 1-month endpoint. The change in NRS between discharge and 1 month differed between the two groups by 0.6 (95% confidence interval -1.0 to 2.2), a clinically and statistically insignificant difference.
Disability, medication use, and healthcare resources utilized were comparable in both groups. Corticosteroids do not seem to benefit patients with acute non-radicular low back pain The roles of bedrest, antiinflammatory medication, and analgesic medication in the treatment of acute back strain were objectively analyzed to determine whether they have a measurable effect on the return of patients to full daily activities as well as on the relief of pain. Two hundred patients were studied prospectively. Each patient had the diagnosis of acute back strain, which was defined as nonradiating low-back pain. The results of the patient's neurologic examination, straight leg raising test, and lumbosacral spine roentgenograms had to be within normal limits for the patient to be included in the study. The results showed that bedrest, as compared with ambulation, will decrease the amount of time lost from work by 50%. Bedrest will also decrease the amount of discomfort by 60%. Analgesic medication, when combined with bedrest, will further decrease the amount of pain incurred, particularly when used in the first three days of the healing process. However, analgesic medication will not allow a more prompt return to work. Antiinflammatory medication, when added to bedrest in the treatment of lumbago, does not provide an advantage over bedrest alone Patients with acute lumbar disc prolapse with sciatica who are not considered candidates for surgery are usually treated with physiotherapy and non-steroidal anti-inflammatory agents. Moreover, the treatment with benzodiazepines is common practice in the absence of class I or II level of evidence. Here we assessed the role of benzodiazepines in the conservative management of acute lumbar disc prolapse.
Using a placebo-controlled, double-blinded design, 60 patients were randomized to receive placebo or diazepam in addition to mechanical physiotherapy and analgesics for the first 7 days of conservative treatment of clinically and radiologically confirmed lumbar disc prolapse. The primary objective was to evaluate if physiotherapy plus analgesics, but without benzodiazepines, is equivalent to the same therapy plus benzodiazepines. The primary endpoint was centralization of referred pain at day 7. Twenty-six female and 34 male patients were enrolled. The median age was 42 years (range 22–68 years). Analysis of the primary endpoint demonstrated equivalence between placebo and diazepam (median 60% vs. 50% reduction of distance of referred pain at day 7) within the predefined equivalence tolerance of 20% at a significance level of p < 0.05. Regarding the secondary endpoints, the median duration of the stay in hospital was shorter in the placebo arm (8 vs. 10 days, p = 0.008), and the probability of pain reduction on a visual analog scale by more than 50% was twice as high in placebo patients (p < 0.0015). Benzodiazepines should not be used routinely in patients treated with mechanical physiotherapy for lumbar disc prolapse Abstract: Objective: Two randomised, double-blind, double-dummy trials evaluated the efficacy and tolerability of meloxicam compared with placebo or diclofenac in patients with acute sciatica. Subjects: 1021 patients with acute sciatica. Treatment and methods: In the first study, 532 patients received meloxicam 7.5 mg, meloxicam 15 mg, or placebo for 7 days. The second study randomised 489 patients to meloxicam 7.5 mg, meloxicam 15 mg, or diclofenac 150 mg for 14 days. Results: Meloxicam 7.5 mg and 15 mg significantly improved overall pain between baseline and day 7 (p < 0.05) compared with placebo.
Furthermore, both meloxicam doses showed similar improvements on all primary and secondary efficacy endpoints compared with diclofenac 150 mg. No significant differences in tolerability were observed between any of the treatment groups in either study. Conclusions: Meloxicam (7.5 mg or 15 mg) was well tolerated and was more effective than placebo, and as effective as diclofenac, in acute sciatica UNLABELLED This multicenter, randomized, double-blind, placebo- and active-controlled trial was conducted to compare the analgesic efficacy and safety of oxymorphone extended release (ER) with placebo and oxycodone controlled release (CR) in ambulatory patients with moderate to severe chronic low back pain requiring opioid therapy. Patients (N = 213) aged 18 to 75 years were randomized to receive oxymorphone ER (10 to 110 mg) or oxycodone CR (20 to 220 mg) every 12 hours during a 7- to 14-day dose-titration phase. Patients achieving effective analgesia at a stable opioid dose entered an 18-day double-blind treatment phase and either continued opioid therapy or received placebo. With stable dosing throughout the treatment phase, oxymorphone ER (79.4 mg/day) and oxycodone CR (155 mg/day) were superior to placebo for change from baseline in pain intensity as measured on a visual analog scale; the LS mean differences were -18.21 and -18.55 (95% CI, -25.83 to -10.58 and -26.12 to -10.98, respectively; P = .0001). Use of rescue medication was 20 mg per day. Adverse events for the active drugs were similar; the most frequent were constipation and sedation. Oxymorphone ER and oxycodone CR were generally safe and effective for controlling low back pain. Oxymorphone ER was equianalgesic to oxycodone CR at half the milligram daily dosage, with comparable safety. PERSPECTIVE Definitive studies of long-acting opioids in patients with chronic low back pain are lacking.
We report the results of a multicenter, randomized, placebo-controlled, double-blind study evaluating the analgesic efficacy and safety of oxymorphone ER and oxycodone CR in opioid-experienced patients with chronic low back pain OBJECTIVE To compare the efficacy and safety of controlled-release oxycodone given every 12 hours with immediate-release oxycodone given four times daily in patients with persistent back pain. DESIGN Randomized, double-blind, active-controlled, two-period crossover trial. PATIENTS Fifty-seven adult outpatients with stable, chronic, moderate-to-severe low back pain despite analgesic therapy were enrolled; 47 were randomized; 11 discontinued for side effects, most commonly nausea and vomiting. INTERVENTIONS Controlled-release oxycodone tablets given every 12 hours; immediate-release oxycodone tablets given four times daily; dose titration with controlled-release or immediate-release for up to 10 days; double-blind treatment for 4-7 days each. OUTCOME MEASURES Patients' pain scores (0 = none, 1 = slight, 2 = moderate, 3 = severe). RESULTS Pain intensity decreased from moderate-to-severe at baseline to slight at the end of titration with both oxycodone formulations. The daily oxycodone dose was 40 mg or less in 68% of patients. During double-blind treatment, mean pain intensity was maintained at 1.2 (0.1 SE) with controlled-release and at 1.1 (0.1 SE) with immediate-release oxycodone. The most common adverse events were constipation, nausea, pruritus, somnolence, and dizziness.
CONCLUSIONS Controlled-release oxycodone given every 12 hours was comparable with immediate-release oxycodone given four times daily in efficacy and safety, and it provides convenient, twice-daily, around-the-clock treatment for selected patients with persistent back pain that is inadequately controlled by nonopioids or as-needed opioid therapy Previous reports concerning the treatment of symptoms deriving from prolapsed lumbar disc with systemic administration of the potent steroid dexamethasone have shown favourable results. The present clinical study includes 39 patients with symptoms of prolapsed lumbar disc, treated in a controlled double-blind investigation with dexamethasone or placebo. Twenty patients had provable effect of the treatment, 19 had no effect. Nineteen patients received dexamethasone (13 + effect), 20 received placebo (7 + effect). The groups were fully comparable, and the difference is not statistically significant. During a period of 3 months, 50% of the patients who improved by the treatment in either group had recurrences leading to operation. It is concluded that the effect of dexamethasone given intramuscularly does not seem to exceed that of placebo in the treatment of prolapsed lumbar disc Abstract Background: Opioid-induced constipation (OIC) is the most prevalent patient complaint associated with opioid use and interferes with analgesic efficacy. Objectives: This PROBE trial compares the overall safety and tolerability of oxycodone/naloxone (OXN) with those of traditional opioid therapy with oxycodone (OXY) or morphine (MOR) in the setting of the German healthcare system.
Research design and methods: This was a prospective, randomized, open-label, blinded endpoint (PROBE) streamlined study (German pain study registry: 2012-0012-05; EudraCT: 2012-001317-16), carried out in 88 centers in Germany, where a total of 453 patients requiring WHO step III opioids to treat low back pain were randomized to OXN, OXY or MOR (1:1:1) for 3 months. The primary outcome was the percentage of patients without adverse event-related study discontinuations who presented with a combination of a ≥50% improvement of pain intensity, disability and quality-of-life and a ≤50% worsening of bowel function at study end. Results: Significantly more OXN patients met the primary endpoint (22.2%) vs. OXY (9.3%; OR: 2.80; p < 0.001) vs. MOR (6.3%; OR: 4.23; p < 0.001), with insignificant differences between OXY vs. MOR (p = 0.155). A ≥50% improvement of pain intensity, functional disability and quality-of-life was found for OXN in 75.0/61.1/66.0% of patients, and thus for all parameters significantly more than with OXY (58.9/49.0/48.3; p < 0.001 for each) or MOR (52.5/46.2/37.3; p < 0.001 for each). A total of 86.8% of OXN patients kept normal BFI scores during treatment, vs. 63.6% for OXY (p < 0.001) vs. 53.8% for MOR (p < 0.001). Overall, 189 TEAEs (OXN: 45, OXY: 69, MOR: 75) occurred in 92 patients (OXN: 21, OXY: 44, MOR: 37), most gastrointestinal (50.8%). One limitation is the open-label design, which presents the possibility of interpretive bias.
Conclusion: Under the conditions of this PROBE design, OXN was associated with significantly better tolerability, a lower risk of OIC and significantly better analgesic efficacy than OXY or MOR.

The myotonolytic activity of a new muscle relaxant, DS 103-282, was compared with that of diazepam in a randomized double-blind study of thirty patients suffering from acute muscular spasm due to disorders of the cervical and lumbar segments of the spine. Fifteen patients received 4 mg DS 103-282 and fifteen received 5 mg diazepam on a three-times-daily regime for 7 days. DS 103-282 was found to alleviate symptoms and improve mobility to a significant degree (p ≤ 0.05) in all parameters evaluated, and was also significantly superior to diazepam in 5 of these. Onset of action was particularly rapid for DS 103-282. Both medications were well tolerated and there was no significant difference between them. On the basis of these data, DS 103-282 may be considered a more powerful and faster-acting myotonolytic agent than diazepam, with which it was compared in similar clinical indications.

BACKGROUND Regular paracetamol is the recommended first-line analgesic for acute low-back pain; however, no high-quality evidence supports this recommendation. We aimed to assess the efficacy of paracetamol taken regularly or as-needed to improve time to recovery from pain, compared with placebo, in patients with low-back pain. METHODS We did a multicentre, double-dummy, randomised, placebo-controlled trial across 235 primary care centres in Sydney, Australia, from Nov 11, 2009, to March 5, 2013. We randomly allocated patients with acute low-back pain in a 1:1:1 ratio to receive up to 4 weeks of regular doses of paracetamol (three times per day; equivalent to 3990 mg paracetamol per day), as-needed doses of paracetamol (taken when needed for pain relief; maximum 4000 mg paracetamol per day), or placebo.
Randomisation was done according to a centralised randomisation schedule prepared by a researcher who was not involved in patient recruitment or data collection. Patients and staff at all sites were masked to treatment allocation. All participants received best-evidence advice and were followed up for 3 months. The primary outcome was time until recovery from low-back pain, with recovery defined as a pain score of 0 or 1 (on a 0-10 pain scale) sustained for 7 consecutive days. All data were analysed by intention to treat. This study is registered with the Australian and New Zealand Clinical Trial Registry, number ACTRN12609000966291. FINDINGS 550 participants were assigned to the regular group (550 analysed), 549 were assigned to the as-needed group (546 analysed), and 553 were assigned to the placebo group (547 analysed). Median time to recovery was 17 days (95% CI 14-19) in the regular group, 17 days (15-20) in the as-needed group, and 16 days (14-20) in the placebo group (regular vs placebo hazard ratio 0·99, 95% CI 0·87-1·14; as-needed vs placebo 1·05, 0·92-1·19; regular vs as-needed 1·05, 0·92-1·20). We recorded no difference between treatment groups for time to recovery (adjusted p=0·79). Adherence to regular tablets (median tablets consumed per participant per day of a maximum 6: 4·0 [IQR 1·6-5·7] in the regular group, 3·9 [1·5-5·6] in the as-needed group, and 4·0 [1·5-5·7] in the placebo group) and the number of participants reporting adverse events (99 [18·5%] in the regular group, 99 [18·7%] in the as-needed group, and 98 [18·5%] in the placebo group) were similar between groups. INTERPRETATION Our findings suggest that regular or as-needed dosing with paracetamol does not affect recovery time compared with placebo in low-back pain, and question the universal endorsement of paracetamol in this patient group.
FUNDING National Health and Medical Research Council of Australia and GlaxoSmithKline Australia.

Tanezumab provides significantly greater improvement in pain, function, and global scores versus placebo and naproxen in patients with chronic low back pain. Tanezumab is a humanized monoclonal antibody that specifically inhibits nerve growth factor as a treatment for chronic pain. This phase IIB study investigated the efficacy and safety of tanezumab for chronic low back pain vs placebo and naproxen. Patients (N = 1347) received intravenous tanezumab (5, 10, or 20 mg every 8 weeks), naproxen (500 mg twice daily), or placebo. The primary efficacy end point was mean change in daily average low back pain intensity (LBPI) from baseline to week 16. Secondary end points included mean change from baseline to week 16 in the Roland-Morris Disability Questionnaire and Patient's Global Assessment (PGA) of low back pain. Tanezumab 10 and 20 mg had similar efficacy profiles and significantly improved LBPI, Roland-Morris Disability Questionnaire, and PGA scores vs both placebo and naproxen (P ≤ .05). Tanezumab 5 mg provided improvement of PGA scores vs placebo (P ≤ .05), and naproxen resulted in significant improvement of LBPI vs placebo (P ≤ .05). Adverse event incidence was comparable across tanezumab doses but higher than with placebo or naproxen. Arthralgia, pain in extremity, headache, and paresthesia were the adverse events most commonly reported by tanezumab-treated patients. The most frequently reported adverse events resulting in discontinuation of tanezumab treatment were arthralgia and paresthesia; the highest frequency was observed with tanezumab 20 mg (both 1.4%). Serious adverse event incidence was similar across treatments.
In conclusion, tanezumab provided significantly greater improvement in pain, function, and global scores vs placebo and naproxen in patients with chronic low back pain.

OBJECTIVE The purpose of this study was to evaluate the efficacy of controlled-release (CR) tramadol and immediate-release (IR) tramadol in patients with moderate or greater intensity chronic noncancer pain. METHODS A total of 122 patients underwent washout from all opioids 2 to 7 days before randomization to 1 of 2 groups: active CR tramadol 200 mg every morning plus placebo IR tramadol 50 mg every 4 to 6 hours PRN rescue, or placebo CR tramadol 200 mg every morning plus active IR tramadol 50 mg every 4 to 6 hours PRN rescue. After 2 weeks, the doses were increased to CR tramadol 400 mg or placebo and IR tramadol 100 mg every 4 to 6 hours PRN or placebo, as rescue. After 4 weeks in the first phase, patients crossed over to the alternative treatment for another 4 weeks. Pain intensity (100-mm visual analog scale [VAS] and 5-point ordinal scales) was assessed twice daily in diaries. Pain intensity, Pain and Disability Index (PDI; 0-10 ordinal scale), Pain and Sleep Questionnaire (100-mm VAS), and analgesic effectiveness (7-point ordinal scale) were assessed at biweekly clinic visits. RESULTS Sixty-five patients (35 men, 30 women) completed the study. Mean (SD) age was 56.5 (12.7) years; mean (SD) weight was 82.0 (18.5) kg. Daily diary pain intensity (mean [SD]) was significantly lower in the CR tramadol group than in the IR tramadol group in the last 2 weeks of each phase (completers: VAS, 29.9 [20.5] vs 36.2 [20.4] mm, P < 0.001; ordinal scale, 1.41 [0.7] vs 1.64 [0.6], P < 0.001; intent-to-treat [ITT] population: VAS, 32.5 [22.9] vs 38.6 [21.2] mm, P < 0.003; ordinal scale, 1.50 [0.8] vs 1.72 [0.7], P < 0.002).
The overall pain intensity scores from the daily diary were also significantly better with CR tramadol for both the completers and the ITT population. Similar results were obtained on the biweekly VAS pain intensity questionnaire. No differences were found between treatments in total PDI or overall Pain and Sleep scores in either population. For the completers, both patients and investigators rated effectiveness higher for CR tramadol than for IR tramadol (P < 0.004 and P < 0.008 for patients and investigators, respectively). CONCLUSION This study reports significant improvement in pain intensity with CR tramadol as compared with IR tramadol.

Study Design. Double-blinded randomized controlled trial. Objective. To test the short-term efficacy of a single intravenous (IV) pulse of glucocorticoids on the symptoms of acute discogenic sciatica. Summary of Background Data. The use of glucocorticoids in the treatment of acute discogenic sciatica is controversial. A potential advantage of IV pulse therapy is the ability to distribute high glucocorticoid concentrations to the area surrounding the prolapsed disc without the risks and inconveniences of an epidural injection. Methods. Patients with acute sciatica (<6-week duration) of radiologically confirmed discogenic origin were randomized to receive either a single IV bolus of 500 mg of methylprednisolone or placebo. Clinical evaluation was performed in a double-blind manner on days 0, 1, 2, 3, 10, and 30. The primary outcome was reduction in sciatic leg pain during the first 3 days following the infusion; secondary outcomes were reduction in low back pain, global pain, functional disability, and signs of radicular irritation. The analysis was performed on an intent-to-treat basis using a longitudinal regression model for repeated measures. Results. A total of 65 patients were randomized, and 60 completed the treatment and the follow-up assessments.
A single IV bolus of glucocorticoids provided significant improvement in sciatic leg pain (P = 0.04) within the first 3 days. However, the effect size was small, and the improvement did not persist. IV glucocorticoids had no effect on functional disability or clinical signs of radicular irritation. Conclusions. Although an IV bolus of glucocorticoids provides short-term improvement in leg pain in patients with acute discogenic sciatica, its effects are transient and of small magnitude.

The efficacy and safety of oral tolperisone hydrochloride (Mydocalm®) in the treatment of painful reflex muscle spasm was assessed in a prospective, randomized, double-blind, placebo-controlled trial. A total of 138 patients, aged between 20 and 75 years, with painful reflex muscle spasm associated with diseases of the spinal column or proximal joints were enrolled in eight rehabilitation centers. Patients were randomized to receive either 300 mg tolperisone hydrochloride or placebo for a period of 21 days. Both treatment groups recovered during the 3-week rehabilitation program. However, tolperisone hydrochloride proved significantly superior to placebo: the change score of the pressure pain threshold, the primary target parameter, increased significantly during therapy with tolperisone hydrochloride (P = 0.03, valid-case analysis) compared with the results obtained on placebo treatment. The overall assessment of efficacy by the patient also demonstrated significant differences in favor of tolperisone hydrochloride. Best results were seen in patients aged between 40 and 60 years with a history of complaints shorter than 1 year and with concomitant physical therapy. The evaluation of safety data, i.e. adverse events and biochemical and hematological laboratory parameters, demonstrated no differences between tolperisone hydrochloride and placebo.
In conclusion, tolperisone hydrochloride represents an effective and safe treatment of painful reflex muscle spasm without the typical side effects of centrally active muscle relaxants.

BACKGROUND Although oral corticosteroids are commonly given to emergency department (ED) patients with musculoskeletal low back pain (LBP), there is little evidence of benefit. OBJECTIVE To determine if a short course of oral corticosteroids benefits ED patients with LBP. METHODS DESIGN Randomized, double-blind, placebo-controlled trial. SETTING Suburban New Jersey ED with 80,000 annual visits. PARTICIPANTS 18-55-year-olds with moderately severe musculoskeletal LBP from a bending or twisting injury ≤2 days prior to presentation. Exclusion criteria were suspected nonmusculoskeletal etiology, direct trauma, motor deficits, and local occupational medicine program visits. PROTOCOL At ED discharge, patients were randomized to either 50 mg prednisone daily for 5 days or identical-appearing placebo. Patients were contacted after 5 days to assess pain on a 0-3 scale (none, mild, moderate, severe) as well as functional status. RESULTS The prednisone and placebo groups had similar demographics and initial and discharge ED pain scales. Of the 79 patients enrolled, 12 (15%) were lost to follow-up, leaving 32 and 35 patients in the prednisone and placebo arms, respectively. At follow-up, the two arms had similar pain on the 0-3 scale (absolute difference 0.2, 95% confidence interval [CI] -0.2, 0.6) and no statistically significant differences in resuming normal activities, returning to work, or days lost from work. More patients in the prednisone group than in the placebo group sought additional medical treatment (40% vs. 18%, respectively; difference 22%, 95% CI 0, 43%).
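The 22% difference in additional-treatment-seeking and its confidence interval are consistent with a standard Wald interval for a difference of two proportions, using the arm sizes reported above (32 prednisone, 35 placebo). A minimal sketch, with an illustrative helper name not taken from the trial:

```python
import math

def risk_difference_ci(p1: float, n1: int, p2: float, n2: int, z: float = 1.96):
    """Point estimate and Wald 95% CI for the difference of two proportions."""
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# 40% of 32 prednisone patients vs. 18% of 35 placebo patients
diff, lo, hi = risk_difference_ci(0.40, 32, 0.18, 35)
print(f"{diff:.0%} (95% CI {lo:.0%}, {hi:.0%})")
```

This reproduces the reported 22% (95% CI 0, 43%) up to rounding: the lower bound computes to just under 1%.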
CONCLUSION We detected no benefit from oral corticosteroids in our ED patients with musculoskeletal LBP.

In a multicentre trial, 456 selected patients with low back pain were randomly allocated to one of four treatments: manipulation, definitive physiotherapy, corset, or analgesic tablets. Patients were reassessed clinically after three weeks' treatment and again after a further three weeks. Questionnaires were used to find out the patients' condition three months and one year after admission to the trial. There were never any important differences among the four groups of patients. A few patients responded well and quickly to manipulation, but there was no way of identifying such patients in advance. The response to a corset was slow, but the long-term effects were at least as good as those of the other treatments. Patients treated only with analgesics fared marginally worse than those on the other three treatments. There is no strong reason, however, for recommending manipulation over physiotherapy or a corset.

Thirty-nine patients with acute low back pain were treated with amitriptyline (150 mg/d) or acetaminophen (2,000 mg/d) in a controlled double-blind design for 5 weeks. Both groups revealed mild depression, normal coping, and increased anxiety at the beginning, with significant improvement in anxiety state and pain at the end of treatment. A repeated measures analysis of variance demonstrated that amitriptyline was more effective than acetaminophen in reducing pain intensity from the second week of treatment. Age and depression were the only significant pretreatment predictors of posttreatment pain. The study evaluates the significance of these findings.

Study Design. Open, randomized, parallel-group multicenter study. Objectives. To compare the efficacy and safety of transdermal fentanyl (TDF) and sustained-release morphine (SRM) in strong-opioid naïve patients with chronic low back pain (CLBP).
Summary of Background Data. Most studies of TDF and SRM have involved patients already receiving strong opioids. This is the first large-scale study focusing on strong-opioid naïve patients with CLBP. Methods. Adults with CLBP requiring regular strong opioid therapy received either TDF or SRM for 13 months. Starting doses were 25 µg/hr fentanyl patches every 72 hours or 30 mg oral morphine every 12 hours. Doses were adjusted according to response. Participants assessed pain relief and bowel function using weekly diaries. Other assessments, including quality of life, disease progression, and side effects, were made by patients and investigators. Results. Data from 680 patients showed that TDF and SRM provided similar levels of pain relief, but TDF was associated with significantly less constipation than SRM, indicating a greater likelihood of satisfactory pain relief without unmanageable constipation for patients receiving TDF. Other ratings were similar for TDF and SRM, but TDF provided greater relief of pain at rest and at night. Conclusions. TDF and SRM provided equivalent levels of pain relief, but TDF was associated with less constipation. This study indicates that sustained-release strong opioids can safely be used in strong-opioid naïve patients.

Background: Escitalopram has never been demonstrated to be useful in the treatment of chronic low back pain (CLBP), while duloxetine has demonstrated an analgesic effect in chronic pain states. The aim of this trial was to examine the efficacy of escitalopram for the treatment of CLBP compared with duloxetine. Methods: A total of 85 adult patients with non-radicular CLBP entered a 13-week randomized study comparing escitalopram 20 mg with duloxetine 60 mg once daily. The primary measure was comparison of the two drugs on reduction in weekly mean 24-h average pain.
Secondary measures included Clinical Global Impressions of Severity (CGI-S) and the 36-item Short-Form Health Survey (SF-36). Results: Eighty patients (n = 39 escitalopram, n = 41 duloxetine) completed the study. No significant differences existed between escitalopram and duloxetine on reduction in weekly mean 24-h average pain at end point. Both escitalopram and duloxetine demonstrated significant improvement on CGI-S and SF-36. Conclusions: Escitalopram and duloxetine demonstrated efficacy and safety in the management of CLBP, with no significant differences. Results of this study should be replicated in a larger sample of patients.

Objectives: The Buprenorphine Transdermal Delivery System (BTDS) is indicated for reduction of pain in moderate to severe chronic low back pain (CLBP), which can affect patients' ability to perform routine activities of daily living (ADLs). This post hoc analysis of clinical trial data examines the impact of BTDS treatment on CLBP patients' ability to perform ADLs that relate to functioning with low back pain. Methods: Data are drawn from a multicenter, enriched-enrollment, randomized, placebo-controlled, double-blind 12-week trial of BTDS for pain control among opioid-naive patients with moderate to severe CLBP. The 23 selected ADLs are those that (1) appear in the Low Back Pain Core Set of the International Classification of Functioning, Disability and Health and (2) link to the content of 3 patient-reported outcome instruments administered during the trial. Logistic regression models estimated the odds ratios (ORs) of BTDS patients' ability to perform each ADL at 12 weeks, controlling for baseline ability, relative to placebo. Results: The ORs for 10 ADLs related to sleeping, lifting, bending, and working reached multiplicity-adjusted statistical significance and indicated a greater ability to perform ADLs among BTDS users than among the placebo group.
These 10 ORs ranged from 1.9 (no physical health-related restrictions on the kind of work performed) to 2.4 (being able to sleep undisturbed by pain). Discussion: These results suggest that for patients with moderate to severe CLBP, 12 weeks' use of BTDS improves the ability to carry out certain ADLs related to sleeping, lifting, bending, and working.

Study Design. Randomized controlled study. Objectives. To investigate the efficacy of treatment with gabapentin on the clinical symptoms and findings in patients with lumbar spinal stenosis (LSS). Summary of Background Data. LSS is a syndrome resulting from the narrowing of the lumbar nerve root canal, spinal canal, and intervertebral foramen, causing compression of the spinal cord. The most significant clinical symptom in patients with LSS is neurologic intermittent claudication (NIC). Gabapentin, which has been used in the treatment of neuropathic pain, may be effective in the treatment of symptoms associated with LSS. Methods. Fifty-five patients with LSS, who had NIC as the primary complaint, were randomized into 2 groups. All patients were treated with therapeutic exercises, a lumbosacral corset with steel bracing, and nonsteroidal anti-inflammatory drugs. The treatment group received gabapentin orally in addition to the standard treatment. Results. Gabapentin treatment resulted in a greater increase in walking distance than was obtained with standard treatment (P = 0.001). Gabapentin-treated patients also showed greater improvements in pain scores (P = 0.006) and recovery of sensory deficit (P = 0.04) than could be attained with the standard treatment. Conclusion.
Based on the results of our pilot study, extensive clinical studies are warranted to investigate the role of gabapentin in the management of symptomatic LSS.

BACKGROUND This study evaluated the safety and efficacy of tramadol ER 300 mg and 200 mg versus placebo once daily in the treatment of chronic low back pain, using an open-label run-in followed by, without washout, a randomized controlled study design. METHODS Adults with scores ≥40 on a pain intensity visual analog scale (VAS; 0 = no pain; 100 = extreme pain) received open-label tramadol ER, initiated at 100 mg once daily and titrated to 300 mg once daily during a three-week open-label run-in. Patients completing the run-in were randomized to receive tramadol ER 300 mg, 200 mg, or placebo once daily for 12 weeks. RESULTS Of 619 patients enrolled, 233 (38 percent) withdrew from the run-in, primarily because of adverse events (n = 128) or lack of efficacy (n = 41). A total of 386 patients were then randomized to receive either 300 mg (n = 128), 200 mg (n = 129), or placebo (n = 129). Following randomization, mean scores for pain intensity VAS since the previous visit, averaged over the 12-week study period, increased more in the placebo group (12.2 mm) than in the tramadol ER 300-mg (5.2 mm, p = 0.009) and 200-mg (7.8 mm, p = 0.052) groups. Secondary efficacy scores for current pain intensity VAS, patient global assessment, Roland Disability Index, and overall sleep quality improved significantly (p ≤ 0.029 each) in the tramadol ER groups compared with placebo. The most common adverse events during the double-blind period were nausea, constipation, headache, dizziness, insomnia, and diarrhea. CONCLUSIONS In patients who tolerated and obtained pain relief from tramadol ER, continuation of tramadol ER treatment for 12 weeks maintained pain relief more effectively than placebo.
Adverse events were similar to those previously reported for tramadol ER.

BACKGROUND CONTEXT Treatment guidelines suggest that most acute low back pain (LBP) episodes substantially improve within a few weeks and that immediate use of imaging and aggressive therapies should be avoided. PURPOSE To assess the actual practice patterns of imaging, noninvasive therapy, medication use, and surgery in patients with LBP, and compare their costs to those of matched controls without LBP. STUDY DESIGN A retrospective analysis of claims data from 40 self-insured employers in the United States. PATIENT SAMPLE The study sample included 211,551 patients, aged 18 to 64 years, with one or more LBP diagnoses (per Healthcare Effectiveness Data and Information Set specification) during 2004 to 2006, identified from a claims database. Patients had continuous eligibility for 12 months or more after their index LBP diagnosis (study period), for 6 months or more before their index diagnosis (baseline period), and no other LBP diagnosis during the baseline period. Patients with LBP were matched to a random cohort of patients without LBP by age, gender, employment status, and index year. OUTCOMES MEASURES Physiological measures (eg, imaging and diagnostic tests), functional measures (eg, pharmacologic and nonpharmacologic treatment for LBP, health-care resource use), and direct (medical and prescription drug) and indirect (disability and medically related absenteeism) costs were assessed within the year after the LBP diagnosis. METHODS Univariate analyses described treatment patterns and compared baseline characteristics and study-period costs. RESULTS Patients with LBP had significantly higher rates of baseline comorbidities and resource use compared with controls. Of patients with LBP, 41.6% had imaging, a mean (median) [standard deviation] of 34.3 (0) [78.6] days after the LBP diagnosis.
Most patients with LBP (69.4%) used medications, starting 51.9 (8) [86.2] days after the diagnosis. Opioids were commonly prescribed early (41.6% of patients; after 82.8 (25) [105.9] days). Of patients with LBP, 2.05% had surgery during the study period. Patients with LBP were likely to have chiropractic treatment first, followed by pharmacotherapy with muscle relaxants and nonsteroidal anti-inflammatory drugs. Except for less surgery, these findings also held for patients with only nonspecific LBP. Patients with LBP had higher mean direct costs compared with controls ($7,211 vs. $2,382, respectively; p<.0001), with surgery patients having mean direct costs of $33,931. CONCLUSIONS Contrary to clinical guidelines, many patients with LBP start incurring significant resource use and associated expenses soon after the index diagnosis. Achieving guideline-concordant care will require substantial changes in LBP practice patterns.

Study Design. This was a randomized, double-blind, placebo-controlled clinical trial. Objective. To assess the efficacy and safety of duloxetine in the treatment of chronic low back pain (CLBP). Summary of Background Data. Imbalance of serotonin and norepinephrine within modulatory pain pathways has been implicated in the development and maintenance of chronic pain. Duloxetine, a selective reuptake inhibitor of serotonin and norepinephrine, has demonstrated clinical efficacy in 3 distinct chronic pain conditions: diabetic peripheral neuropathic pain, fibromyalgia, and chronic pain due to osteoarthritis. Methods. In this randomized double-blind trial, adult nondepressed patients with non-neuropathic CLBP and a weekly mean of the 24-hour average pain score ≥4 at baseline (0-10 scale) were treated with either duloxetine or placebo for 13 weeks. The dose of duloxetine during the first 7 weeks was 60 mg once daily.
At week 7, patients reporting <30% pain reduction had their dose increased to 120 mg. The primary outcome measure was the Brief Pain Inventory (BPI) 24-hour average pain rating. Secondary measures included the Roland-Morris Disability Questionnaire-24; Patient's Global Impressions of Improvement; Clinical Global Impressions-Severity (CGI-S); BPI-Severity and -Interference (BPI-I); and weekly means of the 24-hour average pain, night pain, and worst pain scores from patient diaries. Quality-of-life, safety, and tolerability outcomes were also assessed. Results. Compared with placebo-treated patients (least-squares mean change of −1.50), patients on duloxetine (least-squares mean change of −2.32) had a significantly greater reduction in the BPI 24-hour average pain from baseline to endpoint (P = 0.004 at week 13). Additionally, the duloxetine group significantly improved on Patient's Global Impressions of Improvement; Roland-Morris Disability Questionnaire-24; BPI-Severity and average BPI-Interference; and weekly means of the 24-hour average pain, night pain, and worst pain. Significantly more patients in the duloxetine group (13.9%) than in the placebo group (5.8%) discontinued because of adverse events (P = 0.047). The most common treatment-emergent adverse events in the duloxetine group included nausea, dry mouth, fatigue, diarrhea, hyperhidrosis, dizziness, and constipation. Conclusion. Duloxetine significantly reduced pain and improved functioning in patients with CLBP. The safety and tolerability were similar to those reported in earlier studies.

Twenty-seven investigators participated in a double-blind, parallel placebo-controlled trial of piroxicam involving 278 patients with acute low back pain. Therapy commenced within 48 hours of the injury and continued for 7 days. The drug was given in the recommended regimen of 40 mg once daily for the first 2 days and 20 mg once daily thereafter.
After 3 days of therapy, piroxicam patients showed a statistically greater amount of pain relief in the lying (P<0.001), sitting (P<0.01), and standing (P<0.01) positions, but after 7 days the difference between treatments was no longer significant. After 1 week's therapy, however, the requirement for additional analgesics was significantly lower in the piroxicam group (P<0.05), and more piroxicam than placebo patients (42 versus 28) had returned to work (P<0.05). Toleration was excellent in most patients, with only 13% of the piroxicam and 17% of the placebo group reporting adverse effects of mainly mild or moderate severity. The profile of the adverse effects was similar for both treatments. Piroxicam can provide effective relief of acute low-back pain with good toleration; it should be considered for use in the initial treatment of this condition.

Study Design. Prospective, randomized, controlled trial. Objective. To investigate the effectiveness of home-based exercise on pain, dysfunction, and quality of life (QOL) in Japanese individuals with chronic low back pain (CLBP). Summary of Background Data. Exercise therapy is a widely used treatment for CLBP in many countries. Studies of its effectiveness have been performed only in Western industrialized countries. The existence of cross-cultural differences and the heterogeneity of patients in each country may influence the outcome of interventions for CLBP. Data that would enable researchers to compare the effectiveness of interventions between widely different societies are lacking. Methods. A total of 201 patients with nonspecific CLBP were randomly assigned to either the control or the exercise therapy group: 89 men and 112 women with a mean age of 42.2 years. The control group was treated with nonsteroidal anti-inflammatory drugs (NSAIDs), and the exercise group performed trunk muscle strengthening and stretching exercises.
The primary outcome measures were pain intensity (visual analogue scale) and dysfunction level (Japan Low back pain Evaluation Questionnaire [JLEQ] and Roland-Morris Disability Questionnaire [RDQ]) over 12 months. The secondary outcome measure was finger-floor distance (FFD). Statistical analysis was performed using Wilcoxon signed-ranks and Mann-Whitney U tests, and estimation of the median with 95% CI was calculated. Results. In both groups, significant improvement was found at all points of follow-up assessment. However, JLEQ and RDQ were significantly more improved in the exercise group than in the control group (P = 0.021 for JLEQ, P = 0.023 for RDQ). The 95% CI for the difference of medians of the change ratio between the exercise and NSAID groups, [Exercise] − [NSAID], was −0.25 to −0.02 for JLEQ, −0.33 to 0.00 for RDQ, and −0.20 to 0.06 for the visual analogue scale. Conclusion. The home-based exercise prescribed and monitored by board-certified orthopedic surgeons was more effective than NSAIDs for Japanese patients with CLBP.

We evaluated etoricoxib, a novel COX-2-specific inhibitor, in 319 patients with chronic low back pain (LBP) in this double-blind, placebo-controlled trial. Patients were randomized to a 60 mg dose (n = 103) or 90 mg dose (n = 107) of etoricoxib, or placebo (n = 109), daily for 12 weeks. The primary endpoint was the low back pain intensity scale (visual analog scale of 0 to 100 mm) time-weighted average change from baseline over 4 weeks. Other endpoints included evaluation over 3 months of the low back pain intensity scale, Roland-Morris Disability Questionnaire (RMDQ), low back pain bothersomeness scale, patient and investigator global assessments, Patient Health Survey (MOS SF-12), rescue acetaminophen use, and discontinuation due to lack of efficacy.
Etoricoxib provided significant improvement from baseline versus placebo in pain intensity (4 weeks: 12.9 mm and 10.3 mm for 60-mg and 90-mg doses, P < .001 for each; 12 weeks: 10.5 mm and 7.5 mm for 60-mg and 90-mg doses, P = .001 and .018, respectively). Etoricoxib at either dose led to significant improvement in other endpoints, including RMDQ scores, bothersomeness scores, and global assessments. Etoricoxib given once daily provided significant relief of the symptoms and disability associated with chronic LBP; relief was observed 1 week after initiating therapy, was maximal at 4 weeks, and was maintained over 3 months.

BACKGROUND For Canadian regulatory purposes, an analgesic study was required to complement previously completed, pivotal studies on bowel effects and analgesia associated with controlled-release (CR) oxycodone/CR naloxone. OBJECTIVES To compare the analgesic efficacy and safety of CR oxycodone/CR naloxone versus placebo in patients with chronic low back pain. METHODS Patients requiring opioid therapy underwent a two- to seven-day opioid washout before being randomly assigned to receive either 10 mg/5 mg CR oxycodone/CR naloxone or placebo every 12 h, titrated weekly according to efficacy and tolerability to 20 mg/10 mg, 30 mg/15 mg or 40 mg/20 mg every 12 h. After four weeks, patients crossed over to the alternative treatment for an additional four weeks. Acetaminophen/codeine (300 mg/30 mg every 4 h to 6 h as needed) was provided as rescue medication. RESULTS Of the 83 randomized patients, 54 (65%) comprised the per-protocol population. According to per-protocol analysis, CR oxycodone/CR naloxone resulted in significantly lower mean (± SD) pain scores measured on a visual analogue scale (48.6 ± 23.1 mm versus 55.9 ± 25.4 mm; P=0.0296) and five-point ordinal pain intensity scores (2.1 ± 0.8 versus 2.4 ± 0.9; P=0.0415) compared with placebo.
After the double-blinded phase, patients and investigators both preferred CR oxycodone/CR naloxone over placebo. These outcomes continued in the 79% of patients who chose to continue receiving CR oxycodone/CR naloxone in a six-month, open-label evaluation. CONCLUSIONS In patients complying with treatment as per protocol, CR oxycodone/CR naloxone was effective for the management of chronic low back pain of moderate or severe intensity.

Study Design. A randomized, open, long-term, repeated-dose comparison of an anti-inflammatory drug and two opioid regimens in 36 patients with back pain. Objectives. To examine the long-term safety and efficacy of chronic opioid therapy in a randomized trial of patients with back pain. Methods. All participants underwent a 4-week washout period of no opioid medication before being randomly assigned to one of three treatment regimens for 16 weeks: 1) naproxen only, 2) set-dose oxycodone, or 3) titrated-dose oxycodone and sustained-release morphine sulfate. All patients then were assigned to a titrated dose of opioids for 16 weeks and then gradually tapered off their medication for 12 weeks. Finally, all participants were monitored for a 1-month posttreatment washout period. Each patient was called once a week for a report on pain, activity, mood, medication, hours awake, and adverse effects and was monitored carefully for signs of abuse and noncompliance. Results. Weekly reports during the experimental phase showed the titrated-dose group to have less pain (P < 0.001) and less emotional distress (P < 0.001) than the other two groups. Both opioid groups were significantly different from the naproxen-only group. During the titration phase, patients also reported significantly less pain and improved mood. Few differences were found in activity or hours asleep, or between average pretreatment and posttreatment phone-interview and questionnaire variables.
No adverse events occurred and only one participant showed signs of abuse behavior. Conclusions. The results suggest that opioid therapy has a positive effect on pain and mood but little effect on activity and sleep. Opioid therapy for chronic back pain was used without significant risk of abuse. However, tapered-off opioid treatment is palliative and without long-term benefit.

OBJECTIVE This randomized, double-dummy, double-blind pilot study of acutely exacerbated low back pain was aimed to inform a definitive comparison between Doloteffin, a proprietary extract of Harpagophytum, and rofecoxib, a selective inhibitor of cyclo-oxygenase-2 (COX-2). METHODS Forty-four patients (phyto-anti-inflammatory drug (PAID) group) received a daily dose of Doloteffin containing, inter alia, 60 mg of harpagoside for 6 weeks, and 44 (non-steroidal anti-inflammatory drug (NSAID) group) received 12.5 mg/day of rofecoxib. All were allowed rescue medication of up to 400 mg/day of tramadol. Several outcome measures were examined at various intervals to obtain estimates of effect size and variability that might be used to decide the most suitable principal outcome measure and corresponding numbers required for a definitive study. RESULTS Forty-three PAID and 36 NSAID patients completed the study. Ten PAID and 5 NSAID patients reported no pain without rescue medication for at least 5 days of the 6th week of treatment. Eighteen PAID and 12 NSAID patients had more than a 50% reduction in the week's average of their pain scores between the 1st and 6th weeks. The mean percentage decrease from baseline in the pain component of the Arhus Index was 23 (S.D. 52) in PAID and 26 (S.D. 43) in NSAID. The corresponding measures for the overall Arhus Index were 11 (31) and 16 (24) and, for the Health Assessment Questionnaire, 7 (8) and 6 (7). Tramadol was used by 21 PAID patients and 13 NSAID patients.
Fourteen patients in each group experienced 39 adverse effects, of which 28 (13 in PAID) were judged to some degree attributable to the study medications. CONCLUSION Though no significant intergroup differences were demonstrable, large numbers will be needed to show equivalence.

ABSTRACT Objective: Determine the efficacy and tolerability of oxymorphone extended release (OPANA ER†) in opioid-naive patients with moderate to severe chronic low back pain (CLBP). Design and methods: Patients ≥ 18 years of age were titrated with oxymorphone ER (5- to 10-mg increments every 12 h, every 3–7 days) to a well-tolerated, stabilized dose. Patients were then randomized to continue their oxymorphone ER dose or receive placebo every 12 h for 12 weeks. Oxymorphone immediate release was available every 4–6 h, as needed, for the first 4 days and twice daily thereafter. Results: Sixty-three percent of patients (205/325) were titrated to a stabilized dose of oxymorphone ER, most (203/205) within 1 month. During titration, 18% discontinued from adverse events (AEs) and 1% from lack of efficacy. For patients completing titration, average pain intensity decreased from 69.4 mm at screening to 22.7 mm (p < 0.0001). After randomization, 68% of oxymorphone ER and 47% of placebo patients completed 12 weeks of double-blind treatment. Approximately 8% of patients in each group discontinued because of AEs. Placebo patients discontinued significantly sooner from lack of efficacy than those receiving oxymorphone ER (p < 0.0001). Pain intensity increased significantly more in the placebo group (least squares [LS] mean change 26.9 ± 2.4 [median 28.0]) than in the oxymorphone ER group (LS mean change 10.0 ± 2.4 [median 2.0]; p < 0.0001). Oxymorphone ER was generally well tolerated without unexpected AEs.
Although limitations of a randomized withdrawal study include the potential for unblinding and opioid withdrawal in placebo patients, opioid withdrawal was limited to two patients in the placebo group and one in the oxymorphone ER group. Conclusions: Stabilized doses of oxymorphone ER were generally safe and effective over a 12-week double-blind treatment period in opioid-naive patients with CLBP.

To study the natural history of acute sciatica, 208 patients with obvious symptoms and signs of a lumbar radiculopathy (L5 and S1) were examined within 14 days of onset. A concomitant double-blind investigation of the effect of the nonsteroidal anti-inflammatory drug piroxicam was performed. The results measured by visual analog scale and Roland's functional tests showed a satisfactory improvement throughout the 4 weeks of observation. The piroxicam-treated group had the same results as the control group. Based on questionnaires at months 3 and 12, approximately 30% of the patients still complained about back trouble and 19.5% were out of work after 1 year. Four patients underwent surgery during this period.

The efficacy of an NSAID (tenoxicam) in the treatment of acute low back pain (LBP) was assessed in a double-blind controlled study by using an objective functional evaluation. Seventy-three patients consulting for acute LBP were randomized into two groups: Group I was treated with tenoxicam for 14 days and Group II was given a placebo. Trunk function was measured with a computerized isoinertial dynamometric trunk testing device (Isostation B200). Isometric and dynamic torques, range of motion and movement velocities were measured before treatment and after 14 days. Clinical evaluation was realized by the patient on a pain visual analogue scale (VAS) on days 1, 8 and 15, and by the investigator on a five-point scale on days 8 and 15.
The functional evaluation showed significant differences in favour of the tenoxicam treatment for velocity and extension isometric torque. VAS and investigator evaluations showed a significant difference in favour of tenoxicam on day 8 but no difference on day 15. This study shows that the use of tenoxicam in acute LBP is of interest. Tenoxicam has an effect on pain during the first part of the treatment and may help to restore full function even if the symptoms have disappeared.

PURPOSE To assess response to osteopathic manual treatment (OMT) according to baseline severity of chronic low back pain (LBP). METHODS The OSTEOPATHIC Trial used a randomized, double-blind, sham-controlled, 2 × 2 factorial design to study OMT for chronic LBP. A total of 269 (59%) patients reported low baseline pain severity (LBPS) (<50 mm/100 mm), whereas 186 (41%) patients reported high baseline pain severity (HBPS) (≥50 mm/100 mm). Six OMT sessions were provided over eight weeks and outcomes were assessed at week 12. The primary outcome was substantial LBP improvement (≥50% pain reduction). The Roland-Morris Disability Questionnaire (RMDQ) and eight other secondary outcomes were also studied. Response ratios (RRs) and 95% confidence intervals (CIs) were used in conjunction with Cochrane Back Review Group criteria to determine OMT effects. RESULTS There was a large effect size for OMT in providing substantial LBP improvement in patients with HBPS (RR, 2.04; 95% CI, 1.36-3.05; P<0.001). This was accompanied by clinically important improvement in back-specific functioning on the RMDQ (RR, 1.80; 95% CI, 1.08-3.01; P=0.02). Both RRs were significantly greater than those observed in patients with LBPS. Osteopathic manual treatment was consistently associated with benefits in all other secondary outcomes in patients with HBPS, although the statistical significance and clinical relevance of results varied.
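The response ratios with 95% confidence intervals reported in the OSTEOPATHIC Trial are standard risk-ratio statistics. A minimal sketch of the usual log-scale (Katz) interval is below; the counts in the usage example are hypothetical, not the trial's actual responder numbers.

```python
import math

def response_ratio(events_tx, n_tx, events_ctl, n_ctl, z=1.96):
    """Risk/response ratio with a 95% CI computed on the log scale (Katz method)."""
    rr = (events_tx / n_tx) / (events_ctl / n_ctl)
    # Standard error of log(RR) from the four counts
    se = math.sqrt(1 / events_tx - 1 / n_tx + 1 / events_ctl - 1 / n_ctl)
    lower = math.exp(math.log(rr) - z * se)
    upper = math.exp(math.log(rr) + z * se)
    return rr, lower, upper

# Hypothetical counts: 20/100 responders with treatment vs 10/100 with sham
rr, lo, hi = response_ratio(20, 100, 10, 100)
```

The interval is computed on the log scale because log(RR) is approximately normal in moderate samples, then exponentiated back.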
CONCLUSIONS The large effect size for OMT in providing substantial pain reduction in patients with chronic LBP of high severity was associated with clinically important improvement in back-specific functioning. Thus, OMT may be an attractive option in such patients before proceeding to more invasive and costly treatments.

BACKGROUND There is no head-on comparison of amitriptyline (AMT) and pregabalin (PG) in relieving pain and disability in chronic low backache (CLBA). This randomized controlled trial reports the efficacy and safety of AMT and PG in CLBA. METHODS Patients with CLBA, 15-65 years of age, without specific cause and significant neurological deficit were included. Severity of pain was assessed by Visual Analogue Scale (VAS) and disability by Oswestry Disability Index (ODI). Patients were followed up at 6 and 14 weeks and their VAS score, ODI, and side effects were noted. Primary outcome was pain relief (>50% improvement in VAS score) at 14 weeks, and secondary outcomes were reduction in ODI (>20%) and side effects. RESULTS 200 patients with CLBA were randomized to AMT (n=103) and PG (n=97) using random numbers. The VAS score and ODI improved significantly following AMT and PG at 6 and 14 weeks compared to baseline. The improvement in pain (57.3% vs 39.2%; P=0.01) and disability (65% vs 49.5%; P=0.03), however, was greater in the AMT group. The composite side effects were similar in both groups. CONCLUSION AMT and PG are effective in CLBA, but AMT reduced pain and disability significantly compared to PG.

UNLABELLED This randomized, double-blind, placebo-controlled study assessed efficacy and safety of duloxetine in patients with chronic low back pain (CLBP). Adults (n = 401) with nonneuropathic CLBP and average pain intensity of ≥ 4 on an 11-point numerical scale (Brief Pain Inventory [BPI]) were treated with either duloxetine 60 mg once daily or placebo for 12 weeks.
The primary measure was BPI average pain. Secondary endpoints included Patient's Global Impressions of Improvement (PGI-I), Roland-Morris Disability Questionnaire (RMDQ-24), BPI-Severity (BPI-S), BPI-Interference (BPI-I), and response rates (either ≥ 30% or ≥ 50% BPI average pain reduction at endpoint). Health outcomes included Short Form-36, European Quality of Life-5 Dimensions, and the Work Productivity and Activity Impairment questionnaire. Safety and tolerability were assessed. Compared with placebo-treated patients, duloxetine-treated patients reported a significantly greater reduction in BPI average pain (P ≤ .001). Similarly, duloxetine-treated patients reported significantly greater improvements in PGI-I, BPI-S, BPI-I, 50% response rates, and some health outcomes. The RMDQ and 30% response rate showed numerical improvements with duloxetine treatment. Significantly more patients in the duloxetine group (15.2%) than in the placebo group (5.4%) discontinued because of adverse events (P = .002). Nausea and dry mouth were the most common treatment-emergent adverse events, with rates significantly higher in duloxetine-treated patients. PERSPECTIVE This study provides clinical evidence of the efficacy and safety of duloxetine at a fixed dose of 60 mg once daily in the treatment of chronic low back pain (CLBP). As of December 2009, duloxetine has not received regulatory approval for the treatment of CLBP.

OBJECTIVE To evaluate the efficacy and safety of tramadol in the treatment of chronic low back pain. METHODS A 3-phase trial: (1) a washout/screening phase; (2) a 3-week, open-label, run-in phase; and (3) a 4-week, randomized, placebo-controlled, double-blind treatment phase. Three hundred eighty outpatients between 21 and 79 years with chronic low back pain, with no or a distant history of back surgery, enrolled in the open-label phase and were treated with tramadol up to 400 mg/day.
At the end of the open-label phase, patients who tolerated tramadol and perceived benefit from it were randomized to continue treatment with tramadol or to convert to placebo in the double-blind phase. Reasons for discontinuing from the open-label phase included adverse events, 78 patients (20.5%); drug ineffective, 23 patients (6.1%); and other reasons, 25 patients (6.6%). Two hundred fifty-four patients entered the double-blind phase, during which the daily dose was maintained within the range 200-400 mg tramadol or an equivalent amount of placebo. The primary outcome measure in the double-blind phase was the time to discontinuation due to inadequate pain relief. RESULTS The distribution of time to therapeutic failure was significantly (p ≤ 0.0001) different in the tramadol group compared to placebo. The Kaplan-Meier estimate of the cumulative discontinuation rate due to therapeutic failure was 20.7% in the tramadol group and 51.3% in the placebo group. There were significantly lower (p ≤ 0.0001) mean pain visual analog scores (10 cm scale) among tramadol patients (3.5 cm) compared to placebo patients (5.1 cm) at the final visit of the double-blind phase. Tramadol patients scored significantly better on the McGill Pain Questionnaire (p = 0.0007) and the Roland Disability Questionnaire (p = 0.0001). Five of 127 tramadol-treated patients and 6/127 placebo-treated patients discontinued treatment during the double-blind phase due to an adverse event. Commonly reported adverse events with tramadol included nausea, dizziness, somnolence, and headache. CONCLUSION Among patients who tolerated it well, tramadol was effective for the treatment of chronic low back pain.

In a prospective double-blind study, we compared dexamethasone and placebo in 33 subjects with lumbosacral radicular pain. Of subjects with resting pain, 7/21 improved on dexamethasone, and 4/12 improved on placebo.
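The "Kaplan-Meier estimate of the cumulative discontinuation rate" in the tramadol trial comes from the standard product-limit survival estimator, which handles patients censored before failing. A minimal sketch with made-up observation times (not the trial's data) is below; the cumulative discontinuation rate at time t is 1 minus the surviving fraction.

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival curve.
    times: observation time per patient; events: 1 = therapeutic failure, 0 = censored.
    Returns [(t, S(t))] at each failure time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        failures = sum(e for tt, e in data if tt == t)   # failures at time t
        removed = sum(1 for tt, _ in data if tt == t)    # failures + censored at t
        if failures:
            survival *= 1 - failures / n_at_risk
            curve.append((t, survival))
        n_at_risk -= removed
        i += removed
    return curve

# Hypothetical weeks to failure (0 = still on treatment when last seen)
curve = kaplan_meier([1, 2, 2, 3], [1, 0, 1, 1])
```

Each factor (1 − d/n) is the conditional probability of surviving past one observed failure time, so censored patients contribute to the risk set up to their last visit without counting as failures.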
Of subjects with pain on straight-leg raising, 8/19 improved on dexamethasone and 1/6 on placebo. Of 27 subjects evaluated 1 to 4 years after treatment, 8/16 who had received dexamethasone were asymptomatic or had only occasional mild low-back pain, compared with 7/11 who had received placebo. Thus, dexamethasone is not superior to placebo for either early or long-term relief of lumbosacral radicular pain, but may reduce pain evoked by stretch of acutely inflamed spinal nerve roots.

UNLABELLED Chronic lumbar radicular pain is the most common neuropathic pain syndrome. This was a double-blind, randomized, 2-period crossover trial of topiramate (50 to 400 mg) and diphenhydramine (6.25 to 50 mg) as active placebo to assess the efficacy of topiramate. Each period consisted of a 4-week escalation, a 2-week maintenance at the highest tolerated dose, and a 2-week taper. The main outcome was the mean daily leg pain score on a 0 to 10 scale during the maintenance period. Global pain relief was assessed on a 6-level category scale. In the 29 of 42 patients who completed the study, topiramate reduced leg pain by a mean of 19% (P = .065). Global pain relief scores were significantly better on topiramate (P < .005). Mean doses were topiramate 200 mg and diphenhydramine 40 mg. We concluded that topiramate treatment might reduce chronic sciatica in some patients but causes frequent side effects and dropouts. We would not recommend topiramate unless studies of alternative regimens showed a better therapeutic ratio.
PERSPECTIVE The anticonvulsant topiramate might reduce chronic lumbar nerve root pain through effects such as blockade of voltage-gated sodium channels and AMPA/kainate glutamate receptors, modulation of voltage-gated calcium channels, and gamma-aminobutyric acid agonist-like effects.

CONTEXT This article presents the results of a pivotal Phase 3 study that assesses a new treatment for the management of chronic low back pain: a transdermal patch containing the opioid buprenorphine. In this randomized, placebo-controlled study with an enriched enrollment design, the buprenorphine transdermal system (BTDS) was found to be efficacious and generally well tolerated. OBJECTIVES This enriched, multicenter, randomized, double-blind study evaluated the efficacy, tolerability, and safety of BTDS in opioid-naïve patients who had moderate to severe chronic low back pain. METHODS Patients who tolerated and responded to BTDS (10 or 20 mcg/hour) during an open-label run-in period were randomized to continue BTDS 10 or 20 mcg/hour or receive matching placebo. The primary outcome was "average pain over the last 24 hours" at the end of the 12-week double-blind phase, collected on an 11-point scale (0 = no pain, 10 = pain as bad as you can imagine). Sleep disturbance (Medical Outcomes Study subscale) and total number of supplemental analgesic tablets used were secondary efficacy variables. RESULTS Fifty-three percent of patients receiving open-label BTDS (541 of 1024) were randomized to receive BTDS (n=257) or placebo (n=284). Patients receiving BTDS reported statistically significantly lower pain scores at Week 12 compared with placebo (least square mean treatment difference: -0.58, P=0.010). Sensitivity analyses of the primary efficacy variable and results of the analysis of secondary efficacy variables supported the efficacy of BTDS relative to placebo.
During the double-blind phase, the incidence of treatment-emergent adverse events was 55% for the BTDS treatment group and 52% for the placebo treatment group. Laboratory, vital sign, and electrocardiogram evaluations did not reveal unanticipated safety findings. CONCLUSION BTDS was efficacious in the treatment of opioid-naïve patients with moderate to severe chronic low back pain. Most treatment-emergent adverse events observed were consistent with those associated with the use of opioid agonists and transdermal patches.

Low back pain is the fifth most common reason for all physician visits in the United States (1, 2). Approximately one quarter of U.S. adults reported having low back pain lasting at least 1 whole day in the past 3 months (2), and 7.6% reported at least 1 episode of severe acute low back pain (see Glossary) within a 1-year period (3). Low back pain is also very costly: total incremental direct health care costs attributable to low back pain in the U.S. were estimated at $26.3 billion in 1998 (4). In addition, indirect costs related to days lost from work are substantial, with approximately 2% of the U.S. work force compensated for back injuries each year (5). Many patients have self-limited episodes of acute low back pain and do not seek medical care (3). Among those who do seek medical care, pain, disability, and return to work typically improve rapidly in the first month (6). However, up to one third of patients report persistent back pain of at least moderate intensity 1 year after an acute episode, and 1 in 5 report substantial limitations in activity (7). Approximately 5% of the people with back pain disability account for 75% of the costs associated with low back pain (8). Many options are available for evaluation and management of low back pain.
However, there has been little consensus, either within or between specialties, on appropriate clinical evaluation (9) and management (10) of low back pain. Numerous studies show unexplained, large variations in use of diagnostic tests and treatments (11, 12). Despite wide variations in practice, patients seem to experience broadly similar outcomes, although costs of care can differ substantially among and within specialties (13, 14). The purpose of this guideline is to present the available evidence for evaluation and management of acute and chronic low back pain (see Glossary) in primary care settings. The target audience for this guideline is all clinicians caring for patients with low (lumbar) back pain of any duration, either with or without leg pain. The target patient population is adults with acute and chronic low back pain not associated with major trauma. Children or adolescents with low back pain; pregnant women; and patients with low back pain from sources outside the back (nonspinal low back pain), fibromyalgia or other myofascial pain syndromes, and thoracic or cervical back pain are not included. These recommendations are based on a systematic evidence review summarized in 2 background papers by Chou and colleagues in this issue (15, 16) from an evidence report by the American Pain Society (17). The evidence report (17) discusses the evidence for the evaluation, and the 2 background papers (15, 16) summarize the evidence for management. Methods. The literature search for this guideline included studies from MEDLINE (1966 through November 2006), the Cochrane Database of Systematic Reviews, the Cochrane Central Register of Controlled Trials, and EMBASE.
The literature search included all English-language articles reporting on randomized, controlled trials of nonpregnant adults (age > 18 years) with low back pain (alone or with leg pain) of any duration that evaluated a target medication and reported at least 1 of the following outcomes: back-specific function, generic health status, pain, work disability, or patient satisfaction. The American College of Physicians (ACP) and the American Pain Society (APS) convened a multidisciplinary panel of experts to develop the key questions and scope used to guide the evidence report, review its results, and formulate recommendations. The background papers by Chou and colleagues (15, 16) provide details about the methods used for the systematic evidence review. This guideline grades its recommendations by using the ACP's clinical practice guidelines grading system, adapted from the classification developed by the Grading of Recommendations, Assessment, Development, and Evaluation (GRADE) work group (Appendix Table 1) (18). The evidence in this guideline was first evaluated by the ACP/APS panel by using a system adopted from the U.S. Preventive Services Task Force for grading strength of evidence, estimating magnitude of benefits, and assigning summary ratings (Appendix Tables 2, 3, and 4) (19). The evidence was independently reviewed by the ACP's Clinical Efficacy Assessment Subcommittee. The ratings for individual low back pain interventions discussed in this guideline are summarized in Appendix Table 5 for acute low back pain (<4 weeks' duration) and in Appendix Table 6 for chronic/subacute low back pain (>4 weeks' duration). This guideline considered interventions to have proven benefits only when they were supported by at least fair-quality evidence and were associated with at least moderate benefits (or small benefits but no significant harms, costs, or burdens).
Figures 1 and 2 present an accompanying algorithm. [Captions of appendix tables and figures from the original article:] Appendix Table 1. The American College of Physicians Clinical Practice Guidelines Grading System. Appendix Table 2. Methods for Grading the Strength of the Overall Evidence for an Intervention. Appendix Table 3. Definitions for Estimating Magnitude of Effects. Appendix Table 4. Recommendations and Summary Ratings. Appendix Table 5. Level of Evidence and Summary Grades for Noninvasive Interventions in Patients with Acute Low Back Pain. Appendix Table 6. Level of Evidence and Summary Grades for Noninvasive Interventions in Patients with Chronic or Subacute Low Back Pain. Figure 1. Initial evaluation of low back pain (LBP). Figure 2. Management of low back pain (LBP).

Recommendations: Evaluation of Low Back Pain. Recommendation 1: Clinicians should conduct a focused history and physical examination to help place patients with low back pain into 1 of 3 broad categories: nonspecific low back pain, back pain potentially associated with radiculopathy or spinal stenosis, or back pain potentially associated with another specific spinal cause. The history should include assessment of psychosocial risk factors, which predict risk for chronic disabling back pain (strong recommendation, moderate-quality evidence). More than 85% of patients who present to primary care have low back pain that cannot reliably be attributed to a specific disease or spinal abnormality (nonspecific low back pain [see Glossary]) (20). Attempts to identify specific anatomical sources of low back pain in such patients have not been validated in rigorous studies, and classification schemes frequently conflict with one another (21). Moreover, no evidence suggests that labeling most patients with low back pain by using specific anatomical diagnoses improves outcomes.
In a minority of patients presenting for initial evaluation in a primary care setting, low back pain is caused by a specific disorder, such as cancer (approximately 0.7% of cases), compression fracture (4%), or spinal infection (0.01%) (22). Estimates for prevalence of ankylosing spondylitis in primary care patients range from 0.3% (22) to 5% (23). Spinal stenosis (see Glossary) and symptomatic herniated disc (see Glossary) are present in about 3% and 4% of patients, respectively. The cauda equina syndrome (see Glossary) is most commonly associated with massive midline disc herniation but is rare, with an estimated prevalence of 0.04% among patients with low back pain (24). A practical approach to assessment is to do a focused history and physical examination to determine the likelihood of specific underlying conditions and measure the presence and level of neurologic involvement (24, 25). Such an approach facilitates classification of patients into 1 of 3 broad categories: nonspecific low back pain; back pain potentially associated with radiculopathy (see Glossary) or spinal stenosis (suggested by the presence of sciatica [see Glossary] or pseudoclaudication); and back pain potentially associated with another specific spinal cause. The latter category includes the small proportion of patients with serious or progressive neurologic deficits or underlying conditions requiring prompt evaluation (such as tumor, infection, or the cauda equina syndrome), as well as patients with other conditions that may respond to specific treatments (such as ankylosing spondylitis or vertebral compression fracture). Diagnostic triage into 1 of these 3 categories helps guide subsequent decision making. Clinicians should inquire about the location of pain, frequency of symptoms, and duration of pain, as well as any history of previous symptoms, treatment, and response to treatment.
The possibility of low back pain due to problems outside the back, such as pancreatitis, nephrolithiasis, or aortic aneurysm, or systemic illnesses, such as endocarditis or viral syndromes, should be considered. All patients should be evaluated for the presence of rapidly progressive or severe neurologic deficits, including motor deficits at more than 1 level, fecal incontinence, and bladder dysfunction. The most frequent finding in the cauda equina syndrome is urinary retention (90% sensitivity) (24). In patients without urinary retention, the probability of the cauda equina syndrome is approximately 1 in 10 000. Clinicians should also ask about risk factors for cancer and infection. In a large, prospective study from a primary care setting, a history of cancer (positive likelihood ratio, 14.7), unexplained weight loss (positive likelihood ratio, 2.7), failure to improve after 1 month (positive likelihood ratio, 3.0), and age older than 50 years (positive likelihood ratio, 2.7) were each associated with a higher likelihood for cancer (26). The posttest probability of cancer in patients presenting with back pain increases from approximately 0.7% to 9% in patients with a history of cancer (not including nonmelanoma skin cancer). In patients with any 1 of the other 3 risk factors, the likelihood of cancer only increases to approximately 1.2% (26). Features predicting the presence of vertebral infection have not been well studied.

Study Design. Randomized, double-blind, placebo-controlled, single-dose crossover study. Objective. To test the analgesic efficacy of oxymorphone hydrochloride (OH) and propoxyphene/acetaminophen (PA) for patients with neurogenic claudication associated with lumbar spinal stenosis. Summary of Background Data. Although opioids are often prescribed for neurogenic claudication, no randomized controlled studies support their efficacy for this condition.
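The jump from a 0.7% baseline probability of cancer to roughly 9% given a positive likelihood ratio of 14.7 is the standard Bayesian update through odds. A minimal sketch of that arithmetic:

```python
def posttest_probability(pretest_prob, likelihood_ratio):
    """Bayes' update via odds: posttest odds = pretest odds x likelihood ratio."""
    pretest_odds = pretest_prob / (1 - pretest_prob)
    posttest_odds = pretest_odds * likelihood_ratio
    return posttest_odds / (1 + posttest_odds)

# History of cancer: ~0.7% baseline prevalence, positive LR 14.7
p = posttest_probability(0.007, 14.7)  # ~0.094, i.e. roughly 9%
```

A likelihood ratio of 1.0 leaves the probability unchanged, which is why the weaker risk factors (LRs near 2.7-3.0) move the posttest probability only slightly above baseline.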
Patients with neurogenic claudication are generally excluded from clinical trials or included with patients who have nonspecific chronic low back pain, yielding a heterogeneous study population with very different pathophysiologies and clinical presentations. Methods. Participants received a single dose of each of the 3 treatments in random order. Treatments were separated by at least 3-day washout periods. The primary outcome variable was the time to first treadmill walking-induced moderate pain (≥4 out of 10 on a Numeric Rating Scale) (Tfirst), assessed 90 minutes after treatment administration. Secondary outcome measures included patient global assessment of low back pain, Roland-Morris Disability Questionnaire, Modified Brief Pain Inventory-Short Form, Oswestry Disability Index, and Swiss Spinal Stenosis Questionnaire. Results. The study was prematurely terminated because of the removal of PA from the US market. Twenty-four patients were randomized; 21 completed all 3 treatment periods. There were no significant differences among the treatment groups with respect to the median Tfirst (OH − placebo: median [98.3% confidence limits] = −0.25 min [−6.54, 5.00]; PA − placebo: 0.02 min [−7.65, 4.90]; OH − PA: −0.27 min [−5.56, 6.66]). Conclusion. This trial failed to demonstrate a benefit of OH or PA in patients experiencing neurogenic claudication. Considering the potential negative side effects of chronic opioid use, additional research is necessary to evaluate the efficacy of sustained opioid treatment specifically for neurogenic claudication. Level of Evidence: Two separate trials compared controlled-release (CR) oral oxycodone (administered every 12 hours) with immediate-release (IR) oxycodone (4 times a day) to determine whether patients with chronic pain could be titrated to stable pain control as readily with the CR as with the IR formulation.
In one study, 48 patients with cancer pain were randomized to open-label titration with either CR or IR oxycodone (maximum dose, 400 mg/day) for a period of up to 21 days. In a study of similar design, 57 patients with low back pain were titrated with either CR or IR oxycodone (maximum dose, 80 mg/day) for a period of up to 10 days. The majority of patients in both studies were converted to oxycodone from other opioid analgesics. Results of both studies showed no difference between CR and IR oxycodone with respect to the percentage of patients achieving stable pain control, the time to achieve stable pain control, and the degree of pain control achieved. Among cancer patients, 85% achieved stable analgesia: 92% with the CR formulation and 79% with the IR formulation. Among noncancer patients, 91% achieved stable pain control: 87% with the CR formulation and 96% with the IR formulation. The most commonly reported adverse effects in both studies were similar for the two formulations and were those anticipated with opioids: nausea, vomiting, constipation, somnolence, dizziness, and pruritus. Nausea and vomiting were the most frequently cited reasons for treatment discontinuations. These studies suggest that dose titration can be accomplished as readily with oral CR oxycodone as with IR oxycodone in patients with chronic, moderate to severe pain. A total of 395 male infantry recruits were evaluated in a prospective study of possible risk factors for overexertional back pain and the efficacy of drug treatment regimens for this syndrome. Recruits were classified into subgroups of lumbar or thoracic, and paraspinal or spinous process pain. Recruits were divided into three treatment groups: Ibuprofen, Paracetamol, and no drug treatment. Of the recruits, 18% were diagnosed as having overexertional back pain during the course of 14 weeks of training.
By multivariate analysis, low body mass index was found to be a risk factor for overexertional lumbar pain (p = 0.005) and increased lumbar lordosis a risk factor for overexertional thoracic pain (p = 0.005). Of recruits with overexertional back pain, 65% were asymptomatic by the end of basic training. There was no statistically significant difference in cure rates among the treatment groups. Study Design. A randomized, double-blind, placebo-controlled trial of patients with radicular low back pain who presented to an emergency department (ED) within 1 week of pain onset. Objective. We hypothesized that a single intramuscular 160 mg dose of methylprednisolone acetate would improve pain and functional outcomes 1 month after ED discharge if the corticosteroid were administered early in disease symptomatology. Summary of Background Data. Parenteral corticosteroids are not recommended for acute, radicular low back pain, though their role in this disease process is ill-defined. To date, this medication class has only been studied in a highly selected group of patients requiring hospitalization. Methods. Adults between the ages of 21 and 50 who presented to an ED with low back pain and a positive straight leg raise test were enrolled. The primary outcome was change in pain intensity on an 11-point numerical rating scale 1 month after the ED visit. Secondary outcomes 1 month after ED discharge included analgesic use, functional disability, and adverse medication effects. Results. Six hundred thirty-seven patients were approached for participation, 133 were eligible, and 82 were randomized. Baseline characteristics were comparable between the groups. The primary outcome, a comparison of the mean improvement in pain intensity, favored methylprednisolone by 1.3 (P = 0.10). Some secondary outcomes favored methylprednisolone, such as use of analgesic medication within the previous 24 hours (22% vs.
43%, 95% CI for difference of 20%: 0%-40%) and functional disability (19% vs. 49%, 95% CI for difference of 29%: 9%-49%). Adverse medication effects 1 week after ED discharge were reported by 32% of methylprednisolone and 24% of placebo patients (95% CI for difference of 9%: −12% to 30%). Conclusion. This was a negative study, though there was a suggestion of benefit of methylprednisolone acetate in a population of young adults with acute radicular low back pain. Further work with a larger sample of patients is needed. OBJECTIVE To assess the efficacy and safety of diclofenac-K 12.5 mg tablets in the treatment of acute low back pain. MATERIAL/METHOD A multiple-dose, double-blind, double-dummy, randomized, placebo-controlled, parallel-group trial compared diclofenac-K (12.5 mg; n = 124) with ibuprofen (200 mg; n = 122) and placebo (n = 126) in patients with moderate-to-severe acute low back pain. The treatment consisted of an initial dose of 2 tablets followed by 1 or 2 tablets every 4-6 hours as needed (maximum 6 tablets/day) for 7 days. The primary efficacy outcome for the initial dose was TOTPAR-3, the summed total pain relief over the first 3 hours. Secondary initial-dose outcomes included TOTPAR-6, summed pain intensity differences SPID-3 and SPID-6, time to rescue medication or remedication, and the End of First Dose global efficacy assessment. The primary efficacy outcome for the flexible multiple-dosing regimen was the End of Study global efficacy assessment. Secondary outcomes for multiple dosing included time to rescue medication over the entire study, the End of Day global efficacy assessments (daily over Days 1-7), pain intensity differences on the VAS measured at Visits 2 and 3, and change in the Eifel algofunctional index. Safety/tolerability was assessed by recording adverse events.
RESULTS Diclofenac-K 12.5 mg demonstrated superiority vs placebo on the primary efficacy parameter and almost all secondary initial-dose outcomes. With respect to the initial dose, diclofenac-K 12.5 mg was also significantly superior to ibuprofen 200 mg on SPID-3. Ibuprofen 200 mg was superior to placebo only on the End of First Dose global efficacy assessment. The flexible multiple-dosing regimens of diclofenac-K and ibuprofen were both significantly superior to placebo on the End of Study global efficacy assessment, time to rescue medication over the entire study period, the End of Day global efficacy assessment on Days 1-2, pain intensity difference on the VAS at Visit 3, and the Eifel algofunctional index at Visit 3 (also at Visit 2 in the diclofenac-K 12.5 mg group). Both active treatments were as well tolerated as placebo. CONCLUSIONS The flexible multiple-dosing regimen of diclofenac-K 12.5 mg (initial dose of 2 tablets followed by 1-2 tablets every 4-6 hours, max. 75 mg/day) is an effective and safe treatment of acute low back pain. The effect of antidepressant medication on chronic low-back pain patients was studied in a randomized blind crossover study. Among those patients who completed the study, there was a 46% decrease in the use of analgesics while on amitriptyline when compared to placebo (P < 0.005). There was also improvement in affect, but no measurable change in activity level. The MMPI profile of those patients who were unable to comply with the study protocol differed from that of patients who completed the study. The noncompliers demonstrated an elevation of the F, Pd, Pt (P < 0.05), and Mf (P < 0.01) scales. Although the interpretation of such a profile is left open to speculation, it may serve as an indicator of noncompliant individuals. Abstract Objectives: Buprenorphine HCl buccal film has been developed for treating chronic pain utilizing BioErodible MucoAdhesive (BEMA®) delivery technology.
Buccal buprenorphine (BBUP; Belbuca™, Endo Pharmaceuticals) was evaluated for the management of moderate to severe chronic low back pain (CLBP) requiring around-the-clock analgesia in a multicenter, double-blind, placebo-controlled, enriched-enrollment, randomized-withdrawal study in opioid-naive patients. Methods: Patients (n = 749) were titrated to a dose of BBUP (range, 150-450 µg every 12 h) that was generally well tolerated and provided adequate analgesia for ≥14 days, and then randomized to BBUP (n = 229) or placebo (n = 232). The primary efficacy variable was the change from baseline to week 12 of double-blind treatment in the mean of daily average pain intensity scores (numeric rating scale from 0 [no pain] to 10 [worst pain imaginable]). Results: Patients were experiencing moderate to severe pain at study entry: mean (SD) = 7.15 (1.05). Following titration, pain was reduced to the mild range: 2.81 (1.07). After randomization, mean (SD) pain scores increased from baseline to week 12 more with placebo (1.59 [2.04]) than with BBUP (0.94 [1.85]), with a significant between-group difference (−0.67 [95% CI: −1.07 to −0.26]; p = 0.0012). A significantly larger percentage of patients receiving BBUP versus placebo had ≥30% pain reduction (63% vs 47%; p = 0.0012). During double-blind treatment, the most frequent adverse events (AEs) with BBUP were nausea (10%), constipation (4%), and vomiting (4%). The most common AEs with placebo were nausea (7%), upper respiratory tract infection (4%), headache (3%), and diarrhea (3%). Conclusions: These findings demonstrate the efficacy and tolerability of BBUP among opioid-naive patients requiring around-the-clock opioid treatment for CLBP. BACKGROUND Duloxetine has demonstrated an analgesic effect in chronic pain states. This study assesses the efficacy of duloxetine in chronic low back pain (CLBP).
METHODS Adult patients with non-radicular CLBP entered this 13-week, double-blind, randomized study comparing duloxetine 20, 60, or 120 mg once daily with placebo. The primary measure was comparison of duloxetine 60 mg with placebo on weekly mean 24-h average pain. Secondary measures included the Roland-Morris Disability Questionnaire (RMDQ-24), Patient's Global Impression of Improvement (PGI-I), Brief Pain Inventory (BPI), safety, and tolerability. RESULTS Four hundred four patients were enrolled; 267 completed. No significant differences existed between any dose of duloxetine and placebo on reduction in weekly mean 24-h average pain at endpoint. Duloxetine 60 mg was superior to placebo from weeks 3-11 in relieving pain, but not at weeks 12-13. Duloxetine 60 mg demonstrated significant improvement on the PGI-I, RMDQ-24, BPI-average pain, and BPI-average interference. Significantly more patients taking duloxetine 120 mg (24.1%) discontinued because of adverse events, versus placebo (8.5%). CONCLUSIONS Duloxetine was superior to placebo on the primary objective from weeks 3-11, but superiority was not maintained at endpoint. Duloxetine was superior to placebo on many secondary measures and was well tolerated. OBJECTIVE A single-agent, extended-release formulation of hydrocodone (HC) has been developed for the treatment of chronic moderate-to-severe pain. This study was designed to examine the safety and efficacy of HC extended release in opioid-experienced adults with moderate-to-severe chronic low back pain (CLBP). METHODS This multicenter, enriched-enrollment, randomized-withdrawal study comprised an open-label conversion/titration phase (≤6 weeks) followed by placebo-controlled, double-blind treatment (12 weeks). During the conversion/titration phase, subjects (N = 510) converted from their current opioid and were titrated to a stabilized dose of HC extended release (20-100 mg every 12 hours).
During treatment, subjects (N = 151 per group) received HC extended release or placebo; rescue medication was permitted. The primary efficacy endpoint was the mean change in average pain intensity from baseline to day 85. Response rates (30% pain improvement) and satisfaction (Subject Global Assessment of Medication) were assessed. RESULTS Demographic and baseline characteristics were similar between groups. The mean ± SD change in average pain intensity score from baseline to day 85 was significantly lower in the HC extended-release treatment group vs placebo (0.48 ± 1.56 vs 0.96 ± 1.55; P = 0.008). Significantly more responders were in the treatment group (68% vs 31%; P < 0.001). Mean Subject Global Assessment of Medication scores increased significantly (0.8 ± 1.3 vs 0.0 ± 1.4; P < 0.0001), indicating greater satisfaction with HC extended release. The adverse event profile was consistent with other opioids. CONCLUSIONS Extended-release HC is well tolerated and effective, without the acetaminophen-associated risk of liver toxicity, for the treatment of CLBP. Objectives: This multicenter, randomized, double-blind, placebo-controlled study with an enriched-enrollment, randomized-withdrawal design was conducted to evaluate the analgesic efficacy and safety of single-entity, once-daily hydrocodone 20 to 120 mg tablets (HYD) in opioid-naive and opioid-experienced patients with uncontrolled moderate to severe chronic low back pain (CLBP). Research design and methods: The primary endpoint was week 12 pain intensity scores (11-point scale, 0 = no pain) using a mixed-effect model with repeated measures incorporating a pattern-mixture model framework. Responder analysis was a secondary endpoint. Safety was assessed.
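A responder split like the 68% vs 31% above maps directly onto an absolute risk difference and a number needed to treat (NNT = 1 / risk difference). Neither is reported in the abstract, so this is a derived illustration, not a study result:

```python
def risk_difference_and_nnt(p_treatment, p_control):
    """Absolute risk difference between responder proportions, and NNT = 1 / risk difference."""
    rd = p_treatment - p_control
    return rd, 1.0 / rd

# Responder rates from the HC extended-release trial above
rd, nnt = risk_difference_and_nnt(0.68, 0.31)
print(rd)   # ≈ 0.37: about 37 more responders per 100 patients treated
print(nnt)  # ≈ 2.7: roughly 3 patients treated per additional responder
```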
Results: Out of 905 patients who were treated with HYD during the open-label titration period, 588 (65%) were randomized to continue to receive HYD (n = 296; 20-120 mg taken once daily, average daily dose 57 mg) or a matching placebo (n = 292). HYD demonstrated superior pain reduction (p = 0.0016); this result was supported by sensitivity analyses using different approaches to handling missing data. Proportions of patients achieving ≥30% and ≥50% improvement in pain from screening to week 12 also favored HYD (p = 0.0033 and 0.0225, respectively). HYD was generally well tolerated. Conclusions: HYD was shown to be an efficacious treatment for CLBP in this study. There were no new or unexpected safety concerns detected.
11,044
32,249,534
Roux-en-Y gastric bypass caused more alterations in gut microbial composition than sleeve gastrectomy.
Substantial differences in the response of gut microbial composition to metabolic and bariatric surgery have been reported. Therefore, the goal of the present review is to evaluate whether methodological differences could be driving this lack of consistency.
BACKGROUND Changes in gut microbiota induced by bariatric surgery have been associated with metabolic benefits. OBJECTIVES Our aim was to identify specific gut microbiota that may contribute to the improvement of type 2 diabetes (T2D) after Roux-en-Y gastric bypass (RYGB). SETTING Laboratories of Shanghai Diabetes Institute and Shanghai Sixth People's Hospital. METHODS Diabetic rats induced via a high-fat diet and low-dose streptozotocin administration were randomized to RYGB or sham surgery, and stool samples were collected at baseline and at postoperative week 8. The fecal microbiota was profiled using 16S ribosomal RNA gene sequencing. Additionally, we performed a case-control study of the gut microbial community profiles of T2D patients compared with those of healthy individuals via 16S ribosomal RNA gene sequencing of mucosal-luminal interface samples collected from the ascending colon during colonoscopy. RESULTS RYGB significantly reduced weight and improved glucose tolerance and insulin sensitivity in diabetic rats. Principal coordinate analysis showed that RYGB caused marked alterations in the gut microbiota. The RYGB group was postoperatively enriched for Bacteroidetes, Proteobacteria, Fusobacteria, and Actinobacteria, whereas the sham surgery group was enriched for Firmicutes and Verrucomicrobia. Based on the gut microbial patterns in the T2D patients, we found that the family Coriobacteriaceae within Actinobacteria might contribute to the beneficial effects of RYGB on T2D. CONCLUSIONS RYGB significantly improves glucose metabolism and alters the gut microbiota.
Moreover, the family Coriobacteriaceae may partly mediate the beneficial effects of RYGB on T2D and thus possibly contribute to the development of novel bacteria-based therapeutic approaches. Background and Objectives Bariatric surgery improves metabolic diseases and alters the intestinal microbiota in animals and humans, but different procedures reportedly have different impacts on the intestinal microbiota. We developed laparoscopic sleeve gastrectomy with duodenojejunal bypass (LSG-DJB) as an alternative to laparoscopic Roux-en-Y gastric bypass (LRYGB), in addition to laparoscopic sleeve gastrectomy (LSG), for Japanese patients with obesity. We investigated the precise changes in the intestinal microbiota induced by these procedures in the present study. Methods A prospective observational study of 44 Japanese patients with obesity was conducted [22 patients underwent LSG, 18 underwent LSG-DJB, and 4 underwent laparoscopic adjustable gastric banding (LAGB)]. The patients' clinical parameters and intestinal microbiota were investigated before and for 6 months after surgery. The microbiota was analyzed by a 16S rDNA method. Results LSG and LSG-DJB significantly improved the metabolic disorders in the patients with obesity. The proportion of the phylum Bacteroidetes and the order Lactobacillales increased significantly in the LSG group, and that of the order Enterobacteriales increased significantly in the LSG-DJB group. Conclusions LSG and LSG-DJB improved obesity and type 2 diabetes in Japanese patients with obesity, but the impact of LSG-DJB on the intestinal microbiota differed from that of LSG. This difference in impact on the intestinal environment could explain the different efficacies of LSG and LSG-DJB in terms of their ability to resolve metabolic disorders in the clinical setting. BACKGROUND Changes in the gut microbiome following bariatric surgery have been causally linked to metabolic benefits.
OBJECTIVES We sought to characterize and assess the stability of gut microbiome shifts following sleeve gastrectomy (SG). SETTING University laboratories. METHODS Diet-induced obese mice were randomized to SG or sham surgery. Mice were housed individually or cohoused such that one SG mouse was housed with one weight-matched, sham-operated mouse. Fecal samples were collected before and on postoperative days 7 and 28. Bacterial composition in feces was characterized using next-generation Illumina sequencing of 16S rRNA. RESULTS SG mice lost more weight and were more insulin sensitive than sham mice, independent of housing status (P < .05). One week following surgery, fecal samples from all mice showed shifts in the microbiome that persisted only in SG-operated mice. Cohousing did not alter the microbial composition of SG-operated mice. Cohoused sham-operated mice showed a unique shift in microbial composition on postoperative day 28 that differed from that of individually housed, sham-operated mice (P < .001). Cohousing did not affect metabolic outcomes of either SG or sham surgery. CONCLUSION SG results in acute and sustained shifts in the gut microbiome. SG-associated shifts are not altered by reexposure to obesity-associated gut microbiota. Background Roux-en-Y gastric bypass (RYGB) is an effective means to achieve sustained weight loss for morbidly obese individuals. Besides rapid weight reduction, patients achieve major improvements in insulin sensitivity and glucose homeostasis. Dysbiosis of the gut microbiota has been associated with obesity and some of its co-morbidities, like type 2 diabetes, and major changes in gut microbial communities have been hypothesized to mediate part of the beneficial metabolic effects observed after RYGB. Here we describe changes in gut microbial taxonomic composition and functional potential following RYGB.
Methods We recruited 13 morbidly obese patients who underwent RYGB, carefully phenotyped them, and had their gut microbiomes quantified before (n = 13) and 3 months (n = 12) and 12 months (n = 8) after RYGB. Following shotgun metagenomic sequencing of the fecal microbial DNA purified from stools, we characterized the gut microbial composition at the species and gene levels, followed by functional annotation. Results In parallel with the weight loss and metabolic improvements, gut microbial diversity increased within the first 3 months after RYGB and remained high 1 year later. RYGB led to altered relative abundances of 31 species (P < 0.05, q < 0.15) within the first 3 months, including those of Escherichia coli, Klebsiella pneumoniae, Veillonella spp., Streptococcus spp., Alistipes spp., and Akkermansia muciniphila. Sixteen of these species maintained their altered relative abundances during the following 9 months. Interestingly, Faecalibacterium prausnitzii was the only species that decreased in relative abundance. Fifty-three microbial functional modules increased in relative abundance between baseline and 3 months (P < 0.05, q < 0.17). These functional changes included increased potential (i) to assimilate multiple energy sources using transporters and phosphotransferase systems, (ii) to use aerobic respiration, (iii) to shift from protein degradation to putrefaction, and (iv) to use amino acids and fatty acids as energy sources. Conclusions Within 3 months after morbidly obese individuals had undergone RYGB, their gut microbiota featured an increased diversity, an altered composition, an increased potential for oxygen tolerance, and an increased potential for microbial utilization of macro- and micro-nutrients.
These changes were maintained for the first year post-RYGB. Trial registration Current controlled trials (ID NCT00810823, NCT01579981, and NCT01993511). Background The role of the gut microbiome in arresting pathogen colonization and growth is important for protection against Clostridium difficile infection (CDI). Observational studies associate proton pump inhibitor (PPI) use and CDI incidence. We hypothesized that PPI use affected the distal gut microbiome over time, an effect that would be best explored by a time-longitudinal study of healthy subjects on PPI in comparison with treatment-naïve CDI subjects. This study enrolled nine healthy human subjects and five subjects with treatment-naïve CDI. After random assignment to a low (20 mg/day) or high (2 × 20 mg/day) dose group, fecal samples were collected from the nine healthy subjects before, during, and after 28 days of PPI use. This was done in conjunction with pre-treatment fecal collection from the CDI subjects. High-throughput sequencing (16S rRNA) was performed on time-longitudinal samples to assess changes to the healthy gut microbiome associated with prolonged PPI usage. The healthy samples were then compared with those from the CDI subjects to explore changes over time to the gut microbiome associated with PPI use and potentially related to CDI. Results We report that PPI usage at low and high dosages, administered for 28 days, resulted in decreases in observed operational taxonomic unit (OTU) counts after both 1 week and 1 month. This decrease resulted in observed OTU levels similar to those found in treatment-naïve CDI patients, which was partly reversible after a 1-month recovery period. We did not detect a dose-dependent difference in OTU levels, nor did we detect significant changes in taxa previously reported to be affected by PPI treatment.
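The "observed OTU counts" tracked in the PPI study above are the simplest alpha-diversity measure: the number of operational taxonomic units with at least one read in a sample. A toy illustration (the count vectors are invented, not the study's data):

```python
def observed_otus(otu_counts):
    """Observed richness: number of OTUs detected (nonzero read count) in one sample."""
    return sum(1 for count in otu_counts if count > 0)

baseline_sample = [120, 45, 3, 0, 88, 1, 0, 7]  # reads per OTU before PPI
on_ppi_sample   = [150, 60, 0, 0, 95, 0, 0, 2]  # fewer OTUs detected on PPI
print(observed_otus(baseline_sample))  # 6
print(observed_otus(on_ppi_sample))    # 4
```

Because this measure counts only presence/absence, it is sensitive to sequencing depth, which is one reason studies like the one above report it alongside other diversity metrics.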
Conclusion While our observation of diminishing observed OTU counts during PPI therapy is a preliminary finding in a small cohort, our hypothesis that PPIs disrupt the healthy human gut microbiome is supported in this group. We conclude that the decreases in observed species counts were reversible within 1 month after cessation of PPI usage. This finding may be a potential explanation for the association between prolonged PPI usage and CDI incidence. Summary Bariatric surgery is currently the most effective procedure for the treatment of obesity. Given the role of the gut microbiota in regulating host metabolism and adiposity, we investigated the long-term effects of bariatric surgery on the microbiome of patients randomized to Roux-en-Y gastric bypass or vertical banded gastroplasty and matched for weight and fat mass loss. The two surgical procedures induced similar and durable changes in the gut microbiome that were not dependent on body mass index and resulted in altered levels of fecal and circulating metabolites compared with obese controls. By colonizing germ-free mice with stools from the patients, we demonstrated that the surgically altered microbiota promoted reduced fat deposition in recipient mice. These mice also had a lower respiratory quotient, indicating decreased utilization of carbohydrates as fuel. Our results suggest that the gut microbiota may play a direct role in the reduction of adiposity observed after bariatric surgery. Despite substantial research efforts, the mechanisms proposed to explain weight loss after gastric bypass (RYGB) and sleeve gastrectomy (SL) do not explain the large individual variation seen after these treatments. A complex set of factors is involved in the onset and development of obesity, and these may also be relevant for understanding why success with treatments varies considerably between individuals.
This calls for explanatory models that take into account not only biological determinants but also behavioral, affective, and contextual factors. In this prospective study, we recruited 47 women and 8 men, aged 25-56 years, with a BMI of 45.8 ± 7.1 kg/m2, from the waiting list for RYGB and SL at Køge Hospital, Denmark. Pre-surgery and 1.5, 6, and 18 months after surgery, we assessed various endpoints spanning multiple domains. Endpoints were selected on the basis of previous studies and include physiological measures: anthropometrics, vital signs, biochemical measures and appetite hormones, genetics, gut microbiota, appetite sensation, food and taste preferences, neural sensitivity, sensory perception, and movement behaviors; psychological measures: general psychiatric symptom load, depression, eating disorders, ADHD, personality disorder, impulsivity, emotion regulation, attachment pattern, general self-efficacy, alexithymia, internalization of weight bias, addiction, quality of life, and trauma; and sociological and anthropological measures: sociodemographic measures, eating behavior, weight control practices, and psycho-social factors. Joining these many endpoints and methodologies from different scientific disciplines and creating a multi-dimensional predictive model has not previously been attempted. Data on the primary endpoint are expected to be published in 2018. Trial registration ClinicalTrials.gov ID NCT02070081. Background: Probiotics are commonly used after bariatric surgery; however, uncertainty remains regarding their efficacy. Our aim was to compare the effect of probiotics vs placebo on hepatic, inflammatory, and clinical outcomes following laparoscopic sleeve gastrectomy (LSG).
Methods: This randomized, double-blind, placebo-controlled trial of 6-month treatment with probiotics (Bio-25; Supherb) vs placebo, with 6 months of additional follow-up, was conducted among 100 morbidly obese nonalcoholic fatty liver disease (NAFLD) patients who underwent LSG surgery. The primary outcome was a reduction in liver fat content, measured by abdominal ultrasound, and secondary outcomes were improvement of fibrosis, measured by shear-wave elastography, metabolic and inflammatory parameters, anthropometrics, and quality of life (QOL). Fecal samples were collected and analyzed for microbial composition. Results: One hundred patients (60% women, mean age 41.9 ± 9.8 years, body mass index 42.3 ± 4.7 kg/m2) were randomized, 80% attended the 6-month visit, and 77% completed the 12-month follow-up. Fat content and NAFLD remission rate were similarly reduced in the probiotics and placebo groups at 6 months postsurgery (−0.9 ± 0.5 vs −0.7 ± 0.4 score; P = 0.059 and 52.5% vs 40%; P = 0.262, respectively) and at 12 months postsurgery. Fibrosis, liver enzymes, C-reactive protein (CRP), leptin, and cytokeratin-18 levels were significantly reduced, and QOL significantly improved, within groups (P ≤ 0.014 for all) but not between groups (P ≥ 0.173 for all) at 6 and 12 months postsurgery. Within-sample microbiota diversity (alpha-diversity) increased at 6 months postsurgery compared with baseline in both study arms (P ≤ 0.008) and decreased again at 12 months postsurgery compared with 6 months postsurgery (P ≤ 0.004) but did not reach baseline values. Conclusions: Probiotics administration does not improve hepatic, inflammatory, or clinical outcomes 6 and 12 months post-LSG. Introduction Type 2 diabetes (T2D) in association with obesity is an increasing disease burden. Bariatric surgery is the only effective therapy for achieving remission of T2D among those with morbid obesity.
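The within-sample (alpha) diversity reported in the probiotics trial above is typically summarized with an index such as Shannon's H' = −Σ p_i ln p_i over taxon relative abundances. A minimal sketch with made-up count vectors (not the trial's data):

```python
import math

def shannon_diversity(counts):
    """Shannon index H' = -sum(p_i * ln p_i) over taxa with nonzero counts."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

before = [900, 50, 30, 20]     # one dominant taxon -> low diversity
after  = [300, 250, 250, 200]  # more even community -> higher diversity
print(shannon_diversity(before) < shannon_diversity(after))  # True
```

The index rises with both richness (more taxa) and evenness (more balanced abundances), which is why it can increase postsurgery even when total taxon counts change little.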
It is unclear which of the two most commonly performed types of bariatric surgery, laparoscopic sleeve gastrectomy (LSG) or laparoscopic Roux-en-Y gastric bypass (LRYGB), is more effective for obese patients with T2D. The primary objective of this study is to determine whether LSG or LRYGB is more effective in achieving HbA1c < 6% (< 42 mmol/mol) without the use of diabetes medication at 5 years. Methods and analysis Single-centre, double-blind (assessor and patient), parallel, randomised clinical trial (RCT) conducted in New Zealand, targeting 106 patients. Eligibility criteria include age 20-55 years, T2D of at least 6 months' duration, and body mass index 35-65 kg/m2 for at least 5 years. Randomisation 1:1 to LSG or LRYGB used random number codes disclosed to the operating surgeon after induction of anaesthesia. A standard medication adjustment schedule will be used during postoperative metabolic assessments. Secondary outcomes include the proportions achieving HbA1c < 5.7% (39 mmol/mol) or HbA1c < 6.5% (48 mmol/mol) without the use of diabetes medication, comparative weight loss, obesity-related comorbidity, operative complications, revision rate, mortality, quality of life, and anxiety and depression scores. Exploratory outcomes include changes in satiety, gut hormones, and gut microbiota, to gain underlying mechanistic insights into T2D remission. Ethics and dissemination Ethics approval was obtained from the New Zealand regional ethics committee (NZ93405), which also provided independent safety monitoring of the trial. The study commenced in September 2011. Recruitment was completed in October 2014. Data collection is ongoing. Results will be reported in manuscripts submitted to peer-reviewed journals and in presentations at national and international meetings.
Trial registration numbers ACTRN12611000751976, NCT01486680; Pre-results OBJECTIVE It is of vital importance to elucidate the triggering factors of obesity and type 2 diabetes to improve patient care. Bariatric surgery has been shown to prevent and even cure diabetes, but the mechanism is unknown. Elevated levels of lipopolysaccharide (LPS) predict incident diabetes, but the sources of LPS are not clarified. The objective of the current study was to evaluate the potential impact of plasma LPS on abdominal obesity and glycemic control in subjects undergoing bariatric surgery. RESEARCH DESIGN AND METHODS This was a prospective observational study involving a consecutive sample of 49 obese subjects undergoing bariatric surgery and 17 controls. Main assessments were plasma LPS, HbA1c, adipose tissue volumes (computed tomography), and quantified bacterial DNA in adipose tissue compartments. RESULTS Plasma levels of LPS were elevated in obese individuals compared with controls (P < 0.001) and were reduced after bariatric surgery (P = 0.010). LPS levels were closely correlated with HbA1c (r = 0.56; P = 0.001) and intra-abdominal fat volumes (r = 0.61; P < 0.001), but only moderately correlated with subcutaneous fat volumes (r = 0.33; P = 0.038). Moreover, there was a decreasing gradient (twofold) in bacterial DNA levels going from mesenteric via omental to subcutaneous adipose tissue compartments (P = 0.041). Finally, reduced LPS levels after bariatric surgery were directly correlated with a reduction in HbA1c (r = 0.85; P < 0.001).
CONCLUSIONS Our findings support a hypothesis of translocated gut bacteria as a potential trigger of obesity and diabetes, and suggest that the antidiabetic effects of bariatric surgery might be mechanistically linked to, and even the result of, a reduction in plasma levels of LPS Evidence suggests a correlation between the gut microbiota composition and weight loss caused by caloric restriction. Laparoscopic sleeve gastrectomy (LSG), a surgical intervention for obesity, is classified as a predominantly restrictive procedure. In this study we investigated functional weight loss mechanisms with regard to gut microbial changes and energy harvest induced by LSG and a very low calorie diet in ten obese subjects (n = 5 per group) demonstrating identical weight loss during a follow-up period of six months. For gut microbiome analysis next generation sequencing was performed and faeces were analyzed for targeted metabolomics. The energy-reabsorbing potential of the gut microbiota decreased following LSG, indicated by the Bacteroidetes/Firmicutes ratio, but increased during diet. Changes in butyrate-producing bacterial species were responsible for the Firmicutes changes in both groups. No alteration of faecal butyrate was observed, but the microbial capacity for butyrate fermentation decreased following LSG and increased following dietetic intervention. LSG resulted in enhanced faecal excretion of nonesterified fatty acids and bile acids. LSG, but not dietetic restriction, improved the obesity-associated gut microbiota composition towards a lean microbiome phenotype. Moreover, LSG increased malabsorption due to loss in energy-rich faecal substrates and impairment of bile acid circulation. This trial is registered with ClinicalTrials.gov NCT01344525 Background It is unclear whether specific gut microbiota is associated with remission of type 2 diabetes (T2D) after distinct types of bariatric surgery.
Aims The aim of this study is to examine gut microbiota changes after laparoscopic Roux-en-Y gastric bypass (RYGB) or sleeve gastrectomy (SG) surgery in obese patients with T2D. Methods Whole-metagenome shotgun sequencing of DNA fragments using Illumina HiSeq2000 was obtained from stool samples collected from 14 obese T2D patients pre-operatively (while on very low calorie diet) and 1 year after randomisation to laparoscopic SG (n = 7) or RYGB (n = 7). Resulting shotgun reads were annotated with the Kyoto Encyclopedia of Genes and Genomes (KEGG). Results Body weight reduction and dietary change was similar 1 year after both surgery types. Identical proportions (n = 5/7) achieved diabetes remission (HbA1c < 48 mmol/mol without medications) 1 year after RYGB and SG. RYGB resulted in increased Firmicutes and Actinobacteria phyla but decreased Bacteroidetes phyla. SG resulted in increased Bacteroidetes phyla. Only an increase in Roseburia species was observed among those achieving diabetes remission, common to both surgery types. KEGG Orthology and pathway analysis predicted contrasting and greater gut microbiota metabolism changes after diabetes remission following RYGB than after SG. Those with persistent diabetes post-operatively had higher Desulfovibrio species pre-operatively. Conclusions Overall, RYGB produces greater and more predicted favourable changes in gut microbiota functional capacity than SG. An increase in Roseburia species was the only compositional change common to both types of surgery among those achieving diabetes remission Background and Aims Small intestinal bacterial overgrowth (SIBO) has been described in obese patients. The aim of this study was to prospectively evaluate the prevalence and consequences of SIBO in obese patients before and after bariatric surgery.
Patients and Methods From October 2001 to July 2009, in obese patients referred for bariatric surgery (BMI > 40 kg/m2 or > 35 in association with comorbidities), a glucose hydrogen (H2) breath test (BT) was performed before and/or after either Roux-en-Y gastric bypass (RYGBP) or adjustable gastric banding (AGB) to assess the presence of SIBO. Weight loss and serum vitamin concentrations were measured after bariatric surgery while a multivitamin supplement was systematically given. Results Three hundred seventy-eight (mean ± SD) patients who performed a BT before and/or after surgery were included: before surgery, BT was positive in 15.4% (55/357). After surgery, BT was positive in 10% (2/20) of AGB and 40% (26/65) of RYGBP (p < 0.001 compared to the preoperative situation). After RYGBP, patients with positive BT had similar vitamin levels and a lower caloric intake (983 ± 337 vs. 1271 ± 404 kcal/day, p = 0.014) but a significantly lower weight loss (29.7 ± 5.6 vs. 37.7 ± 12.9 kg, p = 0.002) and lower percent of total weight loss (25.6 ± 6.0 vs. 29.2 ± 6.9%, p = 0.044). Conclusion In this study, SIBO is present in 15% of obese patients before bariatric surgery. This prevalence does not increase after AGB while it rises up to 40% of patients after RYGBP and it is associated with lower weight loss Background Bariatric surgery is known as one of the most effective treatments for sustainable weight loss; however, it may be associated with some complications. This study was designed to examine the effects of probiotic supplementation on some morbidities related to this surgery. Methods This was a placebo-controlled, double-blind, randomized clinical trial on morbidly obese patients referred for One Anastomosis Gastric Bypass-Mini Gastric Bypass (OAGB-MGB) surgery to a tertiary referral center.
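Several of these abstracts report weight loss as "percent of total weight loss" (%TWL) or "percent excess weight loss" (%EWL). The definitions are standard in the bariatric literature: %TWL relates the loss to preoperative weight, while %EWL relates it to the excess over an ideal weight, commonly taken as the weight at BMI 25. A minimal sketch (the example patient values are hypothetical):

```python
def pct_total_weight_loss(preop_kg: float, current_kg: float) -> float:
    """Percent total weight loss: loss as a fraction of preoperative weight."""
    return 100.0 * (preop_kg - current_kg) / preop_kg

def pct_excess_weight_loss(preop_kg: float, current_kg: float,
                           height_m: float, ideal_bmi: float = 25.0) -> float:
    """Percent excess weight loss: loss as a fraction of excess weight,
    with excess defined relative to an ideal weight at BMI 25 (a common convention)."""
    ideal_kg = ideal_bmi * height_m ** 2
    return 100.0 * (preop_kg - current_kg) / (preop_kg - ideal_kg)

# Hypothetical patient: 130 kg preoperatively, 95 kg at follow-up, 1.70 m tall
print(round(pct_total_weight_loss(130, 95), 1))         # %TWL
print(round(pct_excess_weight_loss(130, 95, 1.70), 1))  # %EWL
```

Because %EWL divides by a smaller denominator, it always exceeds %TWL for the same patient, which is why the two measures are not interchangeable across studies.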
Patients were assigned to receive a probiotic supplement (Familact®) or placebo from 4 weeks prior to surgery to 12 weeks after surgery. Anthropometric, biochemical, and inflammatory indices were evaluated at the beginning and the end of the study. Results At the end of the study, significant improvements in some serum inflammatory markers, vitamin D status, and anthropometric measurements were observed (p < 0.05), and these improvements were significantly greater in the probiotic group than in the placebo group (p < 0.05). Moreover, significant improvements in glycemic indices and lipid profile were observed in both groups; however, these changes were not significantly different between the groups. There was no significant difference in serum levels of vitamin B12, folate, and homocysteine between groups at week 16 of the study. Discussion Our results indicate that probiotic supplementation improves inflammatory markers, body weight loss, and vitamin D status in patients undergoing OAGB-MGB bypass. Whether these findings will be sustained over longer treatment durations remains to be elucidated in future studies. Trial Registration This study has been registered at ClinicalTrials.gov with registration number NCT02708589 Background: Studies have shown that prebiotics and synbiotics modulate the intestinal microbiota and may have beneficial effects on the immune response and anthropometric indices; however, the impact of the use of these supplements after bariatric surgery is not yet known. Goals: This study investigated the effects of prebiotic and synbiotic supplementation on inflammatory markers and anthropometric indices in individuals undergoing open Roux-en-Y gastric bypass (RYGB).
Study: In this randomized, controlled, and triple-blind trial conducted as a pilot study, individuals undergoing RYGB (n=9) and healthy individuals (n=9) were supplemented with 6 g/d of placebo (maltodextrin), prebiotic (fructo-oligosaccharide, FOS), or synbiotic (FOS + Lactobacillus and Bifidobacteria strains) for 15 days. Results: Interleukin-1β, interleukin-6, tumor necrosis factor-α, C-reactive protein, albumin, and the C-reactive protein/albumin ratio showed no significant changes on comparison between groups after supplementation. The reduction in the body weight of patients undergoing RYGB was 53.8% higher in the prebiotic group compared with the placebo group (−0.7 kg, P=0.001), whereas the reduction in BMI and the increase in the percentage of excess weight loss were higher in the placebo and prebiotic groups compared with the synbiotic group (P<0.05). Conclusions: Supplementation of FOS increased weight loss, whereas both prebiotics and synbiotics were not able to promote significant changes in inflammatory markers, although in most analyses there was a reduction in their absolute values. The use of FOS may represent a potential adjunct in the treatment of obesity Background The effect of probiotic supplements among subjects undergoing bariatric surgery indicates conflicting results. Moreover, whether these effects remain after ceasing the treatment remains to be elucidated. This study was conducted to assess the effect of probiotic supplements on blood markers of endotoxin (lipopolysaccharide-binding protein: LBP), inflammation and lipid peroxidation (malondialdehyde: MDA) in patients with morbid obesity undergoing the one-anastomosis gastric bypass (OAGB). Methods This study is a placebo-controlled, double-blind, randomized clinical trial with 9 months of additional follow-up.
Forty-six morbidly obese patients undergoing OAGB were randomized to 4 months of probiotic or placebo supplements. Anthropometric indices and blood concentrations of LBP, inflammatory markers, MDA, vitamin D3, and B12 were measured at 0, 4, and 13 months of the study. Results Probiotic supplements improved serum LBP (P = 0.039), TNF-α (P = 0.005), vitamin B12 (P = 0.03), vitamin D3 (P = 0.001), and weight loss (P = 0.01) at month 4 in comparison to placebo; however, only serum MDA concentrations decreased significantly in the probiotic group compared with those in the placebo group (P = 0.013) at the end of the follow-up period. Discussion It was observed that 4 months of probiotic supplementation compared with placebo prevented an elevation in LBP levels and improved serum TNF-α and 25-OH vitamin D3 concentrations and weight loss in patients undergoing the OAGB surgery. However, these effects did not persist 9 months after the cessation of the treatment. Further investigations are required to determine how long supplementation should last and at which dosage it benefits body status in the long term. Trial Registration This study has been registered at ClinicalTrials.gov with registration number NCT02708589 Bariatric surgery (BS) success rates vary over the long term. A better understanding of weight-loss response may help improve the outcomes of BS. The gut microbiome could be implicated in the success rate of BS. The aim of the study is to analyze the role of the gut microbiome in the success rate of BS. This is a cross-sectional study of a prospective cohort of 24 patients who underwent gastric bypass. Patients were classified based on excess weight loss (EWL) as: Success (EWL ≥ 50% at nadir weight and throughout follow-up), Primary Failure (EWL < 50% at nadir weight and thereafter), and Weight Regain (EWL > 50% at nadir weight, but < 50% at last follow-up visit).
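The three-way outcome classification at the end of the abstract above is a simple decision rule on %EWL at two time points. A minimal sketch, simplifying "throughout follow-up" to the last-visit value (the function name and example values are illustrative, not from the study):

```python
def classify_bs_outcome(ewl_nadir: float, ewl_last: float) -> str:
    """Classify a bariatric-surgery outcome from %EWL at nadir weight and at
    last follow-up, following the three groups described in the abstract.
    Simplification: 'throughout follow-up' is approximated by the last visit only."""
    if ewl_nadir < 50:
        return "Primary Failure"   # never reached 50% EWL
    if ewl_last < 50:
        return "Weight Regain"     # reached 50% EWL at nadir, fell below later
    return "Success"               # at least 50% EWL at nadir and at last visit

# Illustrative patients (hypothetical %EWL values)
print(classify_bs_outcome(65, 58))  # Success
print(classify_bs_outcome(42, 38))  # Primary Failure
print(classify_bs_outcome(60, 41))  # Weight Regain
```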
Gut microbiome analysis was performed by high-throughput sequencing. Cholesterol metabolism was shown as the most affected parameter among groups. The studied groups registered minor changes between their gut microbiome abundances, with Butyrivibrio, Lachnospira and Sarcina among them. However, the Success group shared a more diverse core microbiome than the other groups. We showed evidence of a possible role of the gut microbiome in cholesterol metabolism, possibly through bile acids, relative to the success or failure of BS outcomes. Acinetobacter and Serratia, from the Primary Failure core microbiome, could have implications in its success rate. Sarcina abundance was the genus best related to the post-surgery body mass index (BMI). Gut microbiota could mediate, at least partially, the success rate of BS through their interaction with the bile acid milieu. Further studies are necessary to validate this proof of concept Gut microbiota likely impact obesity and metabolic diseases. We evaluated the changes in gut microbiota after surgical versus medical weight loss in adults with diabetes and obesity. We performed 16S rRNA amplicon sequencing to identify the gut microbial composition at baseline and at 10% weight loss in adults with diabetes who were randomized to medical weight loss (MWL, n = 4), adjustable gastric banding (AGB, n = 4), or Roux-en-Y gastric bypass (RYGB, n = 4). All participants were female, 75% reported black race, with a mean age of 51 years. At similar weight loss amount and glycemic improvement, the RYGB group had the greatest number of bacterial species (10 increased, 1 decreased) that significantly changed (p < 0.05) in relative abundance. Alpha-diversity at follow-up was significantly lower in the AGB group compared to MWL and RYGB (observed species for AGB vs. MWL, p = 0.0093; AGB vs. RYGB, p = 0.0093).
The relative abundance of Faecalibacterium prausnitzii increased in 3 participants after RYGB, 1 after AGB, and 1 after MWL. At similar weight loss and glycemic improvement, the greatest alteration in gut microbiota occurred after RYGB, with an increase in the potentially beneficial bacterium F. prausnitzii. Gut microbial diversity tended to decrease after AGB and increase after RYGB and MWL. Future studies are needed to determine the impact and durability of gut microbial changes over time and their role in long-term metabolic improvement after bariatric surgery in adults with type 2 diabetes. NCTDK089557— Clinical Recently, the link between obesity and gut microbiota has become a focus for research. This study shed some light on the modification of postoperative gut microbial composition after bariatric surgery. A prospective longitudinal study on healthy lean subjects and patients who underwent bariatric surgery (Roux-en-Y gastric bypass and laparoscopic sleeve gastrectomy) was carried out. Anthropometric and metabolic data, smoking, food preference data, and stool samples were collected from lean subjects and from obese patients before and 3 and 6 months after surgery (T0, T3, and T6, respectively). We collected stool samples from 25 obese patients before surgery and 3 and 6 months thereafter and from 25 normal-weight patients. After Roux-en-Y gastric bypass, Yokenella regensburgei (p < 0.05), Fusobacterium varium (p < 0.05), Veillonella dispar/atypica (p < 0.05), and Streptococcus australis/gordonii (p < 0.05) were transiently identified in the gut at T3. Roux-en-Y gastric bypass patients had a permanent increase in Akkermansia muciniphila (p < 0.05), which is associated with healthy metabolism, both at T3 and T6. There were no significant changes in gut microbiota in laparoscopic sleeve gastrectomy patients.
In our study, Roux-en-Y gastric bypass induced major microbial differences and greater weight loss compared with laparoscopic sleeve gastrectomy. Analyzing the microbiota composition, a proliferation of potential pathogens and the onset of beneficial bacteria were observed. The effects of these bacteria on human health are still far from clear. Understanding the mechanisms of action of these bacteria could be the keystone in developing new therapeutic strategies for obesity
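Several of the microbiome abstracts above compare "alpha-diversity" (within-sample diversity) across study arms. One common measure is the Shannon index, H = −Σ pᵢ ln pᵢ over the relative abundances pᵢ of the taxa in a sample. A minimal sketch with hypothetical abundance vectors (not data from any of these studies):

```python
import math

def shannon_diversity(counts):
    """Shannon alpha-diversity index H = -sum(p_i * ln p_i),
    computed from raw taxon counts; zero counts are skipped."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical communities: an even one is more diverse than a skewed one
even = [25, 25, 25, 25]    # H = ln(4), the maximum for 4 taxa
skewed = [85, 5, 5, 5]     # one dominant taxon lowers H
print(shannon_diversity(even), shannon_diversity(skewed))
```

For a fixed number of taxa, H is maximized when abundances are equal, which is why a post-surgical shift toward a more even community registers as higher alpha-diversity.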
11,045
32,315,453
At two years, low-quality evidence from two trials (downgraded for bias and imprecision) suggested there may be a small but clinically uncertain improvement in pain and function. Although it is an established procedure, no high-quality randomised trials have been conducted to determine whether shoulder replacement might be more effective than other treatments for osteoarthritis or rotator cuff tear arthropathy of the shoulder. We remain uncertain about which type or technique of shoulder replacement surgery is most effective in different situations. When humeral hemiarthroplasty was compared to TSR surgery for osteoarthritis, low-quality evidence led to uncertainty about whether there is a clinically important benefit for patient-reported pain or function and suggested there may be little or no difference in quality of life. Evidence is insufficient to show whether TSR is associated with greater or less risk of harm than humeral hemiarthroplasty. Although reverse TSR is now the most commonly performed type of shoulder replacement, we found no studies comparing reverse TSR to any other type of treatment
BACKGROUND Shoulder replacement surgery is an established treatment for patients with end-stage glenohumeral osteoarthritis or rotator cuff tear arthropathy who have not improved with non-operative treatment. Different types of shoulder replacement are commonly used, but their relative benefits and risks compared with one another and with other treatments are uncertain. This expanded-scope review is an update of a Cochrane Review first published in 2010. OBJECTIVES To determine the benefits and harms of shoulder replacement surgery in adults with osteoarthritis (OA) of the shoulder, including rotator cuff tear arthropathy (RCTA).
BACKGROUND Inferior scapular notching following reverse shoulder arthroplasty is due to mechanical impingement and, in some studies, has been associated with poorer functional scores, lower patient satisfaction, and more limited shoulder motion. We aimed to test the hypothesis that inferior positioning of the center of rotation with eccentric glenosphere designs decreases the adduction deficit before impingement occurs and improves clinical outcome. METHODS A randomized, controlled, double-blinded trial was performed. According to the results of a power analysis, fifty patients undergoing reverse shoulder arthroplasty for the diagnosis of cuff tear arthropathy were randomized intraoperatively to receive either a concentric or eccentric glenosphere. The glenoid baseplate was positioned flush to the inferior border of the glenoid before the glenosphere was then attached. Notching was assessed using an anteroposterior radiograph, and clinical outcome was assessed using the visual analog pain scale score, shoulder function rating, American Shoulder and Elbow Surgeons score, and Oxford shoulder score. Active forward elevation and external rotation were assessed. The outcome assessor was blinded to the treatment group. The mean follow-up period for the groups was forty-three and forty-seven months. RESULTS Patient demographics and preoperative scores were similar between the groups. At the time of the final follow-up, four patients (14.8%) in the concentric group had developed inferior scapular notching (two with Nerot grade I and two with Nerot grade II), ranging in size from 1.1 to 7.4 mm, compared with one patient (4.3%; Nerot grade I) in the eccentric group (p = 0.36). No notching occurred in any patient with glenoid overhang of > 3.5 mm. No significant difference between the groups was seen with respect to functional outcome scores, patient satisfaction, or shoulder motion.
CONCLUSIONS There were no differences in notching rates or clinical outcomes between concentric and eccentric glenospheres following reverse shoulder arthroplasty. Inferior glenosphere overhang of > 3.5 mm, however, prevented notching. This may be achieved with a modified surgical technique, but eccentric glenospheres provide an additional option. LEVEL OF EVIDENCE Therapeutic Level I. See Instructions for Authors for a complete description of levels of evidence The choice is not clear cut and patients' expectations and preferences may affect the outcome In this issue Hay et al1 report the results of a randomised clinical trial evaluating the effectiveness of physiotherapy and corticosteroid injection for patients with unilateral shoulder pain. Shoulder pain is a common complaint; estimates of the annual incidence in general practice vary from 6.6 to 25 cases per 1000 patients.2–4 Most patients are treated in primary care. If treatment with analgesics or non-steroidal anti-inflammatory drugs (NSAIDs) is not successful, patients with persistent symptoms are often referred for physiotherapy or treated with local infiltration of a corticosteroid.4 Until recently, evidence on the effectiveness of these interventions was scarce, particularly for primary care patients. Over the past few years two randomised trials have been published that directly compared the effects of physiotherapy with corticosteroid injections.5,6 Both trials were carried out in Dutch general practice. The trial by Hay et al adds important and relevant information to this existing evidence. The trial is characterised by a thorough design, enrolled a relatively large number of patients, used relevant outcome measures, and achieved a nearly complete six months' follow-up of participants. When examining the results of these three primary care trials, three issues arise for discussion.
Firstly, the short-term findings are rather different, with the Dutch trials clearly showing better effects of corticosteroid injections, whereas the English trial reports similar outcomes for the two interventions. Secondly, all three trials show minor and non-significant differences at long-term follow-up.1,6,7 Thirdly, the somewhat ambiguous overall evidence may leave substantial room for considering patient preferences and expectations when applying the results in clinical practice. Figure 1a shows the self-reported change of symptoms at 5–7 weeks of follow-up for BACKGROUND Glenoid component loosening remains a significant issue after anatomic shoulder arthroplasty. Pegged glenoid components have shown better lucency rates than keeled components in the short term; however, midterm to long-term results have not fully been determined. We previously reported early outcomes of the current randomized controlled group of patients, with higher glenoid lucency rates in those with a keeled glenoid. The purpose of this study was to evaluate the radiographic and clinical outcomes of these components at minimum 5-year follow-up. METHODS Fifty-nine total shoulder arthroplasties were performed in patients with primary glenohumeral osteoarthritis. Patients were randomized to receive either a pegged or keeled glenoid component. Three raters graded radiographic glenoid lucencies. Clinical outcome scores and active mobility outcomes were collected preoperatively and at yearly postoperative appointments. RESULTS Of the 46 shoulders meeting the inclusion criteria, 38 (82.6%) were available for minimum 5-year radiographic follow-up. After an average of 7.9 years, radiographic lucency was present in 100% of pegged and 91% of keeled components (P = .617). Grade 4 or 5 lucency was present in 44% of pegged and 36% of keeled components (P = .743).
There were no differences in clinical outcome scores or active mobility outcomes between shoulders with pegged and keeled components at last follow-up. Within the initial cohort, 20% of the keeled shoulders (6 of 30) and 7% of the pegged shoulders (2 of 29) underwent revision surgery (P = .263). Kaplan-Meier analysis showed no significant difference in survival rates between groups (P = .560). CONCLUSION At an average 7.9-year follow-up, non-ingrowth, all-polyethylene pegged glenoid implants are equivalent to keeled implants with respect to radiolucency, clinical outcomes, and need for revision surgery BACKGROUND Presently, there are no approved nonoperative therapies for the ongoing treatment of persistent shoulder pain. Preliminary data suggest that intra-articular sodium hyaluronate injections may be beneficial for the treatment of persistent shoulder pain resulting from various etiologies. The present study evaluated the efficacy and safety of sodium hyaluronate (Hyalgan; molecular weight, 500 to 730 kDa) for these patients. METHODS Six hundred and sixty patients with persistent shoulder pain and limitation resulting from glenohumeral joint osteoarthritis, rotator cuff tear, and/or adhesive capsulitis who had had a failure of conventional therapy were enrolled in this double-blind, randomized, phosphate-buffered saline solution-controlled study, and 456 patients completed twenty-six weeks of follow-up. Patients were randomized to receive either five weekly intra-articular injections of sodium hyaluronate, three weekly intra-articular injections of sodium hyaluronate followed by two weekly intra-articular injections of saline solution, or five weekly intra-articular injections of saline solution.
The main outcomes were improvement in terms of shoulder pain on movement at thirteen weeks after the initiation of treatment (as assessed with use of a 100-mm visual analog scale) and the treatment effect throughout twenty-six weeks. RESULTS For the overall intent-to-treat population, patients who were managed with sodium hyaluronate had greater pain relief than controls did; significant differences were noted at Week 7 (for the five-injection hyaluronate group), Week 17 (for the three and five-injection hyaluronate groups), and Week 26 (for the three-injection hyaluronate group). Analysis of the stratified populations clearly established that this effect was due to benefits experienced by the patients with osteoarthritis. The treatment effect through twenty-six weeks was significant in patients with osteoarthritis in the three-injection (p = 0.003) and five-injection (p = 0.002) groups, with no significant difference for either regimen in patients without osteoarthritis. The safety profile was very favorable, with no product-related serious adverse effects and no between-group differences for any reported adverse event. CONCLUSIONS Although the primary end point of this study (that is, improvement in terms of shoulder pain at thirteen weeks) was not achieved, the overall findings, including secondary end points, indicate that sodium hyaluronate (500 to 730 kDa) is effective and well tolerated for the treatment of osteoarthritis and persistent shoulder pain that is refractory to other standard nonoperative interventions BACKGROUND Controversy exists regarding the optimal technique of subscapularis mobilization during shoulder arthroplasty. The purpose of this study was to compare healing rates and subscapularis fatty infiltration in patients undergoing a lesser tuberosity osteotomy (LTO) versus subscapularis peel for exposure during shoulder arthroplasty.
MATERIALS AND METHODS Eighty-seven patients, with a mean age of 67.8 ± 10.9 years, undergoing shoulder arthroplasty were randomized to receive either an LTO (n = 43) or peel (n = 44). Computed tomography scans were conducted preoperatively and at 12 months postoperatively. Outcome variables included healing rates and subscapularis Goutallier fatty infiltration grade, as well as subscapularis strength and Western Ontario Osteoarthritis of the Shoulder Index and American Shoulder and Elbow Surgeons outcome scores. RESULTS Computed tomography imaging was available in 91% (n = 79) of the cohort. The healing rates for the peel (100%) and for the LTO (95%) did not differ significantly (P = .493). Preoperatively, the mean fatty infiltration grade for the peel (mean, 0.53) was not significantly different (P = .925) from the LTO (mean, 0.54). Postoperatively, the mean Goutallier fatty infiltration grade for the peel (mean, 0.95) did not differ significantly (P = .803) from the LTO (mean, 0.9). A significant increase in subscapularis fatty infiltration grade occurred postoperatively compared with the preoperative status (peel, P = .003; LTO, P = .0002). No statistically significant associations were observed between postoperative fatty infiltration grades and subscapularis strength, Western Ontario Osteoarthritis of the Shoulder Index scores, or American Shoulder and Elbow Surgeons scores. DISCUSSION No statistically significant differences were observed in the healing rates or subscapularis fatty infiltration grades between the peel and the LTO. This trial does not show any clear difference in radiologic and clinical outcomes of one subscapularis management technique over the other BACKGROUND Both total shoulder arthroplasty and hemiarthroplasty have been used commonly to treat severe osteoarthritis of the shoulder; however, their effect on disease-specific quality-of-life outcome is unknown.
The purpose of this study was to compare the quality-of-life outcome following hemiarthroplasty with that following total shoulder arthroplasty in patients with osteoarthritis of the shoulder. METHODS Forty-two patients with a diagnosis of osteoarthritis of the shoulder were randomized to receive a hemiarthroplasty or a total shoulder arthroplasty. One patient died, and all others were evaluated preoperatively and at six weeks and three, six, twelve, eighteen, and twenty-four months postoperatively with use of a standardized format including a disease-specific quality-of-life measurement tool (Western Ontario Osteoarthritis of the Shoulder [WOOS] index), general shoulder rating scales (University of California at Los Angeles [UCLA] shoulder scale, Constant score, and American Shoulder and Elbow Surgeons [ASES] evaluation form), general pain scales (McGill pain score and visual analogue scale), and a global health measure (Short Form-36 [SF-36]). When a patient required revision of a hemiarthroplasty to a total shoulder arthroplasty, the last score before he or she "crossed over" was used for the analysis. RESULTS Significant improvements in disease-specific quality of life were seen two years after both the total shoulder arthroplasties and the hemiarthroplasties. There were no significant differences in quality of life (WOOS score) between the group treated with total shoulder arthroplasty and that treated with hemiarthroplasty (90.6 ± 13.2 and 81.5 ± 24.1 points, respectively; p = 0.18). The other outcome measures demonstrated similar findings. Two patients in the hemiarthroplasty group crossed over to the other group by undergoing a revision to a total shoulder arthroplasty because of glenoid arthrosis. CONCLUSIONS Both total shoulder arthroplasty and hemiarthroplasty improve disease-specific and general quality-of-life measurements.
With the small number of patients in our study, we found no significant differences in these measurements between the two treatment groups. LEVEL OF EVIDENCE Therapeutic Level. OBJECTIVES: To determine, in a population-based study, the influence of occupational factors on the occurrence of shoulder pain and disability. METHODS: A random sample of patients was selected from the register of a general practice in the Greater Manchester area of the United Kingdom. Information was collected by a posted questionnaire with specific enquiries about symptoms in the shoulder region and related disability. A lifetime occupational history was obtained including physical exposures, working conditions, and psychosocial aspects of each workplace. Analysis has been conducted as a case-control study, comparing occupational exposures at the time of onset of symptoms in those with shoulder pain and disability with corresponding occupational exposures in those without shoulder pain and disability. RESULTS: An increased risk of shoulder pain and disability in men was associated with carrying weights on one shoulder (relative risk (RR) 5.5, 95% confidence interval (95% CI) 1.8 to 17), whereas those who reported working with hands above shoulder level, using wrists or arms in a repetitive way, or stretching down to reach below knee level had about twice the risk of shoulder pain and disability. Men working frequently in very cold or damp conditions had a fourfold and sixfold risk respectively of shoulder pain and disability. Reporting of shoulder pain and disability was also more common among men and women who reported that their work caused a lot of stress (RR 1.9, 95% CI 0.9 to 4.1) or was very monotonous (RR 2.7, 95% CI 1.3 to 5.4). The relations between physical exposures, working conditions, and psychosocial factors were independent.
CONCLUSIONS: This population-based study has shown that physical activities carried out at work, the physical conditions under which the work is conducted, psychosocial aspects of work, and the working environment are all independently related to the occurrence of shoulder symptoms and disability, emphasising the multifactorial nature of this condition. HYPOTHESIS The correct implantation of the glenoid component is of paramount importance in total shoulder arthroplasty (TSA). We hypothesized that the accuracy of glenoid positioning in the transverse plane can be improved using intraoperative navigation. MATERIALS AND METHODS This prospective, randomized clinical study comprised 2 groups of 10 patients each with osteoarthritis of the shoulder undergoing TSA, with or without intraoperative navigation. Glenoid version was measured on axial computed tomography scans preoperatively and 6 weeks postoperatively. RESULTS The operating time was significantly longer in the navigation group (169.5 ± 15.2 vs 138 ± 18.4 min). We found an average change of retroversion from 15.4° ± 5.8° (range, 3.0°-24.0°) preoperatively to 3.7° ± 6.3° (range, -8.0° to 15.0°) postoperatively in the navigation group compared with 14.4° ± 6.1° (range, 2.0°-24.0°) preoperatively to 10.9° ± 6.8° (range, 0.0°-19.0°) postoperatively in the group without navigation (P = .021). CONCLUSION We found an improved accuracy in glenoid positioning in the transverse plane using intraoperative navigation. The validity of the study is limited by the small number of patients, which advocates continuation with more patients and longer follow-up. LEVEL OF EVIDENCE Level 2; Therapeutic study. Glenoid loosening is one reason for failure of total shoulder arthroplasty.
Several factors, including radiographic lucency, have been shown to be associated with glenoid loosening. The purpose of this study was to assess the correlation between glenoid design and immediate radiographic lucency in a prospective randomized clinical trial. Total shoulder arthroplasty was performed in 43 patients over a 2-year period. Twenty-three patients were randomized into the keel group and twenty patients into the pegged group. Postoperative radiographs obtained within 6 weeks of surgery were evaluated by 3 raters to determine glenoid lucency. On a scale from 0 (no lucency) to 5 (gross lucency and component loosening), the rate of lucency was 39% (9/23) in the keeled components, which was significantly higher than the rate of 5% (1/20) observed in the pegged components (P = .026). Patient age, gender, and glenoid size did not significantly affect glenoid component lucency (P > .05). The consistency reliability among raters (Cronbach alpha) was 0.87, and the intertester reliability was 0.87. Pegged glenoid components have less radiographic lucency when compared with keeled glenoid components in the immediate postoperative period. Background Although several studies have been performed on the use of various devices in total shoulder arthroplasty (TSA), no data are available in order to establish whether to prefer stemmed or stemless humeral components. Thus, the purpose of our study was to evaluate the short-term functional outcome in a cohort of subjects treated with TSA randomized to treatment with stemmed or stemless prosthesis. Methods In this prospective longitudinal study, we randomized to treatment with a stemmed (group 1) or stemless (group 2) humeral component nineteen subjects (2 M and 17 F) diagnosed with humeral primary osteoarthritis with indication for TSA.
We evaluated the range of movement of all the participants and the functional outcome using the Constant score and simple shoulder test (SST) before and 2 years after surgery. Results No differences were detected between the two groups 2 years after surgery in terms of functional scores and range of motion (p > 0.05). Conclusion Stemmed and stemless prostheses are comparable in terms of functional outcome. These data might be useful for the surgeon in order to choose more tissue-sparing methodologies and less invasive procedures, such as stemless humeral implants. Current knowledge of the clinical course and efficacy of treatment for shoulder pain comes mainly from studies of hospital patients. However, only a few patients experiencing such pain require referral to a specialist. Although shoulder pain is common in the general population, the outcome of patients presenting in general practice is unknown.1 We conducted a prospective cohort study to determine the outcome of shoulder pain in primary care. Twelve general practitioners recruited 166 patients who consulted with a new episode of shoulder pain during one year. They recorded demographic information, diagnosis, management, and an assessment of passive elevation of the shoulder; patients assessed the disability associated with their symptoms with a validated 22-item disability questionnaire.2 To assess outcome, identical disability questionnaires were sent to patients six and 18 months after consultation, together with a question measuring self-assessed change in symptoms. Purpose The aim of this study was to conduct a randomised clinical trial comparing stemmed hemiarthroplasty and resurfacing hemiarthroplasty in the treatment of glenohumeral osteoarthritis.
Methods A total of 40 shoulders (35 patients) were randomised to stemmed hemiarthroplasty or resurfacing hemiarthroplasty and evaluated three and 12 months postoperatively using the Constant-Murley score (CMS) and Western Ontario Osteoarthritis of the Shoulder (WOOS) index. Results There were no statistically significant differences in age, gender or pre-operative scores except for WOOS at baseline. Two patients were lost to follow-up. Significant improvements in CMS and WOOS were observed at one year after both arthroplasty designs. At one year, the mean CMS was 48.9 (range 6–80) after resurfacing hemiarthroplasty and 59.1 (range 0–88) after stemmed hemiarthroplasty (mean difference 10.2 [95% confidence interval (CI) −3.3 to 23.6], P = 0.14). The mean WOOS was 59.2 (range 5.2–100.0) and 79.4 (range 12.8–98.6), respectively (mean difference 20.2 [95% CI 3.4–36.9], P = 0.02). No major complications occurred and there were no revisions. Conclusions The effects of resurfacing hemiarthroplasty tended to be inferior to those of stemmed hemiarthroplasty. It is unclear whether this reflects a real difference in effect or baseline differences due to the limited number of randomised patients. We suggest there is a need for a larger, more definitive trial. BACKGROUND Implant migration, bone mineral density (BMD), length of glenohumeral offset (LGHO), and clinical results were compared for the Copeland (Biomet Inc, Warsaw, IN, USA) and the Global C.A.P. (DePuy Int, Warsaw, IN, USA) humeral head resurfacing implants (HHRIs). METHODS The study randomly allocated 32 patients (13 women), mean age 63 years (range, 39-82 years), with shoulder osteoarthritis to a Copeland (n = 14) or Global C.A.P. (n = 18) HHRI.
Patients were monitored for 2 years with radiostereometry, dual-energy X-ray absorptiometry, the Constant Shoulder Score (CSS), and the Western Ontario Osteoarthritis of the Shoulder Index (WOOS). LGHO was measured preoperatively and 6 months postoperatively. RESULTS At 2 years, total translation (TT) was 0.48 mm (standard deviation [SD], 0.21 mm) for the Copeland and 0.82 mm (SD, 0.46 mm) for the Global C.A.P. (P = .06). Five HHRIs were revised, and in the interval before the last follow-up (revision or 2 years), the TT of 0.58 mm (SD, 0.61 mm) for revised HHRIs was higher (P = .02) than the TT of 0.22 mm (SD, 0.17 mm) in nonrevised HHRIs. A comparison of TT at the last follow-up (revision or 2 years) found no difference between the HHRIs (P = .12). Periprosthetic BMD decreased initially but increased continuously after 6 months for both HHRIs. At 2 years, BMD was 48% higher around the Copeland HHRI (P = .005). The mean difference in LGHO was significantly higher for the Copeland than for the Global C.A.P. HHRI (P = .02). Clinical results evaluated with CSS and WOOS improved over time for both implant groups (P < .01), with no differences between the groups. CONCLUSION Both implants had only little migration and good clinical results. Periprosthetic BMD and LGHO both increased more for the Copeland HHRI than for the Global C.A.P. HHRI. BACKGROUND Controversy exists regarding the optimal technique of subscapularis tendon mobilization during shoulder arthroplasty. The purpose of the present randomized double-blind study was to compare two of these techniques, lesser tuberosity osteotomy and subscapularis peel, with regard to muscle strength and functional outcomes. METHODS Patients undergoing shoulder arthroplasty were randomized to undergo either a lesser tuberosity osteotomy or a subscapularis peel.
The primary outcome was subscapularis muscle strength as measured with an electronic handheld dynamometer at twenty-four months postoperatively. Secondary outcomes included the Western Ontario Osteoarthritis of the Shoulder Index and American Shoulder and Elbow Surgeons scores. A sample size calculation determined that eighty-six patients provided 90% power with a 0.79 effect size to detect a significant difference between groups. RESULTS Forty-three patients were allocated to subscapularis osteotomy, and forty-four patients were allocated to subscapularis peel. Eighty-three percent of the study cohort returned for the twenty-four-month follow-up. The primary outcome of subscapularis muscle strength at twenty-four months revealed no significant difference (p = 0.131) between the lesser tuberosity osteotomy group (mean [and standard deviation], 4.4 ± 2.9 kg) and the subscapularis peel group (mean, 5.5 ± 2.6 kg). Comparison of secondary outcomes, including the Western Ontario Osteoarthritis of the Shoulder Index and American Shoulder and Elbow Surgeons scores, demonstrated no significant differences between groups at any time point. Compared with baseline measures, mean subscapularis muscle strength, Western Ontario Osteoarthritis of the Shoulder Index score, and American Shoulder and Elbow Surgeons score all improved significantly in both groups at twenty-four months (p < 0.001). DISCUSSION No significant differences in the primary or secondary outcomes of function were identified between the lesser tuberosity osteotomy group and the subscapularis peel group.
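Sample-size statements like the one in the trial above can be checked with the standard normal-approximation formula for comparing two group means. This is a minimal sketch, not the authors' actual calculation; the two-sided α = 0.05 is an assumption, and the formula omits the small t-distribution correction:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size: float, power: float, alpha: float = 0.05) -> int:
    """Approximate per-group sample size for a two-sample comparison of
    means: n = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2, rounded up."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)  # two-sided significance quantile
    z_beta = z(power)           # quantile for the desired power
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# Effect size 0.79 at 90% power, as reported in the trial above:
print(n_per_group(0.79, 0.90))  # 34 per group, i.e. roughly 68 patients
```

The 86 patients enrolled exceed this approximate minimum; trials commonly inflate the calculated sample size to allow for attrition (here, 83% returned at follow-up), which would be consistent with the difference.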
For the parameters investigated, this trial does not demonstrate any clear advantage of one subscapularis treatment technique over the other. Background: The indications for resurfacing of the glenoid in patients who have osteoarthritis of the shoulder are not clearly defined; some investigators routinely perform hemiarthroplasty whereas others perform total shoulder arthroplasty. Methods: Forty-seven patients (fifty-one shoulders) who were scheduled to have a shoulder arthroplasty for the treatment of degenerative osteoarthritis were randomly assigned, according to a random-numbers table, to one of two groups: replacement of the humeral head with resurfacing of the glenoid with a polyethylene component with cement (total shoulder arthroplasty [twenty-seven shoulders]) or replacement of the humeral head without resurfacing of the glenoid (hemiarthroplasty [twenty-four shoulders]). All patients received the same type of humeral component, and all operations were performed by or under the direct supervision of the same surgeon. The patients were followed for a mean of thirty-five months (range, twenty-four to seventy-two months) postoperatively. Evaluation was performed with use of the scoring systems of the University of California at Los Angeles and the American Shoulder and Elbow Surgeons. Results: No difference was observed between the preoperative scores for the two groups of patients. Postoperatively, the mean scores with use of the University of California at Los Angeles system and the American Shoulder and Elbow Surgeons system were 23.2 points (range, 10 to 31 points) and 65.2 points (range, 15 to 94 points), respectively, after hemiarthroplasty and 27.4 points (range, 9 to 34 points) and 77.3 points (range, 3 to 100 points), respectively, after total shoulder arthroplasty.
With the numbers available for study, no significant difference was found between the two operative groups with respect to the postoperative score. (Thirty-five subjects per group would be needed, assuming an effect size of 0.60 and a power of 0.80.) Total shoulder arthroplasty provided significantly greater pain relief (p = 0.002) and internal rotation (p = 0.003) than hemiarthroplasty did. Total shoulder arthroplasty also provided superior results in the specific areas of patient satisfaction, function, and strength, although none of these differences were found to be significant with the numbers available. Total shoulder arthroplasty was associated with increased cost ($1177), operative time (thirty-five minutes), and blood loss (150 milliliters) per patient compared with hemiarthroplasty. To date, none of the total shoulder arthroplasties in the study group have been revised. Hemiarthroplasty yielded equivalent results for elevation and external rotation. Three of the twenty-five patients who had had a hemiarthroplasty needed a subsequent operation for resurfacing of the glenoid. The mean cost for the revision operations was $15,998. Conclusions: Total shoulder arthroplasty provided superior pain relief compared with hemiarthroplasty in patients who had glenohumeral osteoarthritis, but it was associated with an increased cost of $1177 per patient. BACKGROUND Glenoid component malposition for anatomic shoulder replacement may result in complications. The purpose of this study was to define the efficacy of a new surgical method to place the glenoid component.
METHODS Thirty-one patients were randomized for glenoid component placement with use of either novel three-dimensional computed tomographic scan planning software combined with patient-specific instrumentation (the glenoid positioning system group), or conventional computed tomographic scan, preoperative planning, and surgical technique, utilizing instruments provided by the implant manufacturer (the standard surgical group). The desired position of the component was determined preoperatively. Postoperatively, a computed tomographic scan was used to define and compare the actual implant location with the preoperative plan. RESULTS In the standard surgical group, the average preoperative glenoid retroversion was -11.3° (range, -39° to 17°). In the glenoid positioning system group, the average glenoid retroversion was -14.8° (range, -27° to 7°). When the standard surgical group was compared with the glenoid positioning system group, patient-specific instrumentation technology significantly decreased (p < 0.05) the average deviation of implant position for inclination and medial-lateral offset. Overall, the average deviation in version was 6.9° in the standard surgical group and 4.3° in the glenoid positioning system group. The average deviation in inclination was 11.6° in the standard surgical group and 2.9° in the glenoid positioning system group. The greatest benefit of patient-specific instrumentation was observed in patients with retroversion in excess of 16°; the average deviation was 10° in the standard surgical group and 1.2° in the glenoid positioning system group (p < 0.001). Preoperative planning and patient-specific instrumentation use resulted in a significant improvement in the selection and use of the optimal type of implant and a significant reduction in the frequency of malpositioned glenoid implants.
CONCLUSIONS Novel three-dimensional preoperative planning, coupled with patient- and implant-specific instrumentation, allows the surgeon to better define the preoperative pathology, select the optimal implant design and location, and then accurately execute the plan at the time of surgery. BACKGROUND We compared hemiarthroplasty (HA) and total shoulder replacement (TSR) for the treatment of osteoarthritis at a minimum of 10 years from primary arthroplasty. METHODS Thirty-three patients (13 HA and 20 TSR) were intraoperatively randomized to HA or TSR after glenoid exposure and were assessed to a minimum of 10 years postoperatively. Apart from those who died, no patients were lost to follow-up. RESULTS At 6 months and 1 year, the TSR patients had less pain than the HA patients (P < .05), and this became more apparent at 2 years postoperatively (P < .02). There were no statistically significant differences between the groups at 10 years with respect to pain, function, and daily activities. No patients in the HA group rated their shoulders as pain-free at 10 years; however, 42% of the surviving TSR patients rated their shoulders as pain-free at 10 years. Four HA patients were revised to TSR due to severe pain secondary to glenoid erosion. Two shoulders in the TSR group have been revised. Nine of the 13 HA patients (69%) and 18 of the 20 TSR patients (90%) remained in situ at death or at the 10-year review. CONCLUSION TSR has advantages over HA with respect to pain and function at 2 years, and there has not been a reversal of the outcomes on longer follow-up. This longer-term review does not support the contention that HA will avoid later TSR complications and, in particular, an unacceptable rate of glenoid component failure. BACKGROUND Considerable interest has been focused on the design of the glenoid component used in total shoulder arthroplasty in order to reduce the risk of loosening.
One design-related feature that has attracted attention is whether to use pegged or keeled cemented glenoid components. The main purpose of this study was to compare the fixation of cemented keeled glenoid components with that of cemented in-line pegged glenoid components. METHODS In a prospective randomized study, we compared the stability of cemented, all-polyethylene, keeled glenoid components and cemented, all-polyethylene, in-line three-pegged glenoid components by radiostereometric analysis. Twenty-seven shoulders in twenty-five patients with osteoarthritis (twenty-two shoulders had primary and five shoulders had secondary osteoarthritis) were included. There were sixteen women and nine men, and the mean age was sixty-four years. Radiostereometric analysis and conventional radiographs were carried out at five days, at four months, and at one and two years postoperatively. RESULTS The mean Constant and Murley score preoperatively and two years postoperatively was 25 and 70, respectively, for shoulders with the keeled glenoid component and 22 and 70 for the shoulders with a pegged component. No significant difference was detected between groups with regard to the average micromigration of the glenoid components at any of the time points. The average translation was < 1 mm, while the median value was < 0.3 mm at two years, with no significant difference between the different axes. In five shoulders (three with the keeled component and two with the pegged component), translation at two years was > 1 mm. In fourteen shoulders (eight with the keeled and six with the pegged component), the rotation around one or several axes was > 2 degrees. We were not able to detect any specific pattern with regard to movement for either type of component, nor were we able to detect any difference between the two types of components in the way they migrated, if migration occurred.
CONCLUSIONS Cemented all-polyethylene keeled or in-line three-pegged glenoid components appear to have similar stability during the first two years after surgery. Studies with a longer follow-up period are needed to relate these findings to long-term clinical and radiographic outcomes. Background This study aimed to assess differences in the fixation and functional outcomes between pegged and keeled all-polyethylene glenoid components for standard total shoulder arthroplasty. Methods Patients were randomized to receive a keeled or pegged all-polyethylene glenoid component. We used model-based radiostereometric analysis (RSA) to assess glenoid fixation and subjective outcome measures to assess patient function. Follow-up examinations were completed at 6 weeks and 6, 12 and 24 months after surgery. Modifications to the RSA surgical, imaging and analytical techniques were required throughout the study to improve the viability of the data. Results Stymied enrolment resulted in only 16 patients being included in our analyses. The RSA data indicated statistically greater coronal plane migration in the keeled glenoid group than in the pegged group at 12 and 24 months. Functional outcome scores did not differ significantly between the groups at any follow-up. One patient with a keeled glenoid showed high component migration after 24 months and subsequently required revision surgery 7 years postoperatively. Conclusion Despite a small sample size, we found significant differences in migration between glenoid device designs. Although clinically these findings are not robust, we have shown the feasibility of RSA in total shoulder arthroplasty as well as the value of a high-precision metric to achieve objective results in a small group of patients. HYPOTHESIS The purpose of this study was to determine if inferior tilt of the glenoid component decreased the amount of radiographic scapular notching after reverse shoulder arthroplasty.
A secondary goal was to determine if inferior tilt had any effect on clinical outcome. MATERIALS AND METHODS A prospective randomized trial of 52 consecutive reverse shoulder arthroplasties performed by 1 surgeon for cuff tear arthropathy was performed. The subjects were randomly assigned to receive a glenoid component with no inferior tilt (control group) or a glenoid component that was inferiorly tilted 10° to protect the inferior glenoid (inferior tilt group). All glenoid components were placed in 3 mm of inferior translation. Radiographic notching was graded at a minimum of 1 year after surgery. Clinical outcomes of the groups were recorded. RESULTS Follow-up radiographs and data were available for 42 subjects, 20 in the inferior tilt group and 22 in the control group. The experimental groups did not differ significantly in the notch ratings or clinical outcomes. Notching occurred in 15 patients (75%) in the inferior tilt group and in 19 (86%) in the control group. Notching scores were 2 or greater in 10 patients (50%) in the inferior tilt group and in 11 (50%) in the control group. CONCLUSION Placing the glenoid component with inferior tilt does not reduce the incidence or severity of radiographic scapular notching after reverse shoulder arthroplasty. No clinical differences were observed between the groups. This prospective randomized study compared the immediate postoperative periglenoid radiolucencies among 3 glenoid-drying techniques used in total shoulder arthroplasty. Seventy-one consecutive patients with primary osteoarthritis underwent total shoulder arthroplasty by use of 1 prosthetic system with convex-back, keeled, polyethylene glenoid components; the same modern, instrumented pressurization technique was used to cement all glenoids.
Of the shoulders, 21 had glenoid implants cemented after bony preparation with thrombin-soaked gel foam, 24 after compressed gas lavage, and 26 after saline solution lavage with sponge drying. The immediate postoperative anteroposterior radiographs were examined to evaluate the presence of periglenoid radiolucencies. Of the patients, 29 (41%) had radiolucencies evident immediately postoperatively, with all radiolucencies occurring in the faceplate zones. The mean total radiolucent line score was 0.63 (P = .94), with no significant difference among cementing preparation techniques (P = .89). Prosthetic mismatch did not differ among glenoid preparation techniques (P = .86). There was no statistical association between prosthetic mismatch and radiolucent line score either across (P = .62) or within (P = .99) the glenoid preparation groups. The associated costs in the gel foam group and compressed gas lavage group were 70 times higher than the cost in the saline solution lavage group. All radiolucencies were noted in the faceplate zones, with no radiolucency greater than 2 mm. Preparation of the glenoid surface for cementing showed no significant difference among the 3 techniques studied, although the material costs were significantly higher in the gel foam and compressed gas lavage groups compared with the saline solution lavage group. BACKGROUND Traditional total shoulder arthroplasty (TSA) involves releasing the subscapularis tendon for exposure. This can potentially lead to subscapularis insufficiency, compromised function, and dissatisfaction. A novel TSA technique preserves the subscapularis tendon by performing the procedure entirely through the rotator interval, allowing accelerated rehabilitation. However, early reports on this approach have noted malpositioning of the humeral component and residual osteophytes.
In a randomized trial, we examined the incidence of humeral head malpositioning, incorrect sizing, and residual osteophytes on postoperative radiographs after subscapularis-sparing TSA compared with the traditional approach. METHODS Patients were prospectively randomized to undergo TSA performed through the traditional or subscapularis-sparing approach. The operating surgeon was blinded to the randomization until the day of surgery. Anatomic reconstruction measurements included humeral head height, humeral head centering, humeral head medial offset, humeral head diameter (HHD), and head-neck angle. Two independent reviewers analyzed the postoperative radiographs to determine anatomic restoration of the humeral head and the presence of residual osteophytes. RESULTS We randomized 96 patients to undergo either the standard approach (n = 50) or the subscapularis-sparing approach (n = 46). There were no significant differences in humeral head height, humeral head centering, humeral head medial offset, HHD, head-neck angle, and anatomic reconstruction index between the 2 groups. However, significantly more postoperative osteophytes (P = .0001) were noted in the subscapularis-sparing TSA group. Although the overall mean was not statistically different, further analysis of HHD showed that more patients in the subscapularis-sparing TSA group were outliers (mismatch > 4 mm) than in the traditional TSA group. CONCLUSIONS Although anatomic restoration of the shoulder can be accomplished using subscapularis-sparing TSA, retained osteophytes and significant mismatch of the HHD raise concerns regarding long-term outcomes. Objective. The Outcome Measures in Rheumatology (OMERACT) Shoulder Core Outcome Set Special Interest Group (SIG) was established to develop a core outcome set (COS) for clinical trials of shoulder disorders. Methods.
In preparation for OMERACT 2016, we systematically examined all outcome domains and measurement instruments reported in 409 randomized trials of interventions for shoulder disorders published between 1954 and 2015. Informed by these data, we conducted an international Delphi consensus study including shoulder trial experts, clinicians, and patients to identify key domains that should be included in a shoulder disorder COS. Findings were discussed at a stakeholder premeeting of OMERACT. At OMERACT 2016, we sought consensus on a preliminary core domain set and input into next steps. Results. There were 13 and 15 participants at the premeeting and the OMERACT 2016 SIG meeting, respectively (9 attended both meetings). Consensus was reached on a preliminary core domain set consisting of an inner core of 4 domains: pain, physical function/activity, global perceived effect, and adverse events including death. A middle core consisted of 3 domains: emotional well-being, sleep, and participation (recreation and work). An outer core of research required to inform the final COS was also formulated. Conclusion. Our next steps are to (1) analyze whether participation (recreation and work) should be in the inner core, (2) conduct a third Delphi round to finalize definitions and wording of domains and reach final endorsement for the domains, and (3) determine which instruments fulfill the OMERACT criteria for measuring each domain. In a prospective, randomized study between 2000 and 2004, 20 patients with primary osteoarthritis of the shoulder had a total shoulder arthroplasty with radiostereometric analysis, 10 with keeled and 10 with pegged glenoid components. The relative movement of the glenoid component with respect to the scapula was measured over a 24-month period. Three keeled and five pegged glenoids needed reaming for erosion.
The largest translations occurred along the longitudinal axis (mean of 1.35 mm for keeled eroded components) (P = .017 for keeled vs pegged components and P = .013 for eroded vs non-eroded components). Both of the other translation axes showed no significant differences. The highest maximum total point movement at 24 months was 2.57 mm for keeled eroded components and 1.64 mm for pegged eroded components (P = .029 for keeled vs pegged components and P = .023 for eroded vs non-eroded components). The largest rotation was anteversion, with mean values of 5.5 degrees for keeled eroded components and 4.8 degrees for pegged eroded components (P = .658 for keeled vs pegged components and P = .90 for eroded vs non-eroded components). The mean varus tilt was 4.5 degrees for keeled eroded components compared with 2.3 degrees for pegged eroded components (P = .004 for keeled vs pegged components and P = .016 for eroded vs non-eroded components), and finally, anterior-posterior rotation mean values were 3.5 degrees for keeled eroded components and 1.1 degrees for pegged eroded components (P = .022 for keeled vs pegged components and P = .04 for eroded vs non-eroded components). In conclusion, whereas all components moved, radiostereometric analysis revealed increased migration with keeled components, exacerbated by glenoid erosion. Furthermore, a distinctive pattern of migration was identified over the 2-year period. BACKGROUND Although cemented humeral fixation is recognized as the standard of care in total shoulder arthroplasty (TSA), uncemented fixation has the potential to provide stable fixation, decrease operative time, and simplify potential revision procedures. This prospective, randomized, double-blind clinical trial compared cemented and uncemented humeral fixation in TSA for primary shoulder osteoarthritis. METHODS Patients with primary shoulder osteoarthritis requiring replacement were screened for eligibility.
After providing informed consent, subjects received baseline clinical and radiologic assessments, computed tomography scans, and standardized TSA. After glenoid component insertion, patients were randomized to either a cemented or uncemented humeral component. The primary outcome was the WOOS (Western Ontario Osteoarthritis of the Shoulder Index) score at 2 years. Other outcomes included the Short Form 12 score, American Shoulder and Elbow Surgeons score, McMaster-Toronto Arthritis Patient Preference Disability Questionnaire, operative time, complications, and revisions. Patients were assessed by a blinded evaluator at 2 and 6 weeks and 3, 6, 12, 18, and 24 months postoperatively. RESULTS In total, 161 patients consented to be included and were randomized: 80 in the cemented group and 81 in the uncemented group. There were no significant differences in demographics or baseline evaluations between groups, except for gender. The 12-, 18-, and 24-month WOOS scores showed a significant difference in favor of the cemented group. The cemented group also had better strength and forward flexion. As expected, the operative time was significantly less for the uncemented group. CONCLUSIONS These findings provide level I evidence that cemented fixation of the humeral component provides better quality of life, strength, and range of motion than uncemented fixation. BACKGROUND Modern cementing techniques have improved glenoid fixation, reduced glenoid lucency seen with keeled components, and may eliminate differences attributable to glenoid design. The purpose of this study was to determine the effect of glenoid design on immediate and follow-up radiographic lucency of pegged and keeled glenoid components, using modern cementing techniques. MATERIAL AND METHODS Fifty-three total shoulder arthroplasties were performed in patients with primary glenohumeral osteoarthritis.
Patients were randomized prospectively to receive either a pegged or keeled glenoid component. Three raters graded radiographic glenoid lucencies. RESULTS On immediate radiographs, there was no significant difference in the rate of glenoid lucency between pegged (0%) and keeled (15%) glenoid components (P = .128). However, after an average of 26 months, the rate of glenoid lucency was significantly higher in patients with keeled components (46%) compared to patients with pegged components (15%) (P = .003). CONCLUSION Even with modern cementing techniques, pegged glenoid components remain radiographically superior to keeled glenoid components. Although total shoulder arthroplasty (TSA) is generally associated with good to excellent outcomes in most patients, the integrity and function of the subscapularis tendon (SSC) is of paramount importance because SSC rupture after TSA can lead to inferior outcomes. Therefore, the efficacy of an SSC-sparing TSA procedure was evaluated through a prospective, double-blinded, randomized study. Patients with end-stage osteoarthritis of the shoulder were randomized into 2 groups. Group 1 patients were treated with TSA in which the prosthesis was inserted entirely through the rotator interval without violating the SSC tendon (SPARING). Group 2 patients were treated with TSA using the SSC tenotomy approach (STANDARD). Both the patients and the evaluators remained blinded to the surgical approach throughout the study. Outcome data collected included the visual analog scale score for pain and the American Shoulder and Elbow Surgeons outcome score. Complete 2-year outcome data were collected from 32 SPARING and 38 STANDARD patients at a mean follow-up of 31.1 and 33.4 months, respectively. The American Shoulder and Elbow Surgeons and visual analog scale scores improved significantly for both groups. Differences between groups did not reach statistical significance.
Complication profiles were similar for the 2 groups, with 3 patients in the SPARING group and 2 patients in the STANDARD group requiring revision surgery during the study. At short-term follow-up, the outcome of TSA using the SSC-sparing surgical approach was similar to the outcome of TSA using the standard approach. Studies with longer follow-up are required to document the potential benefits of this surgical technique. [Orthopedics. 2019;42(1):e61-e67.] BACKGROUND The purpose of this randomized controlled trial was to compare humeral inclinations of 135° and 155° in patients undergoing primary reverse shoulder arthroplasty (RSA). Our hypothesis was that forward flexion would be higher in the 155° group but be associated with a higher rate of scapular notching. METHODS A randomized controlled trial was conducted on 100 primary RSAs performed with a humeral inclination of either 135° or 155°. The prostheses were otherwise identical and a neutral glenosphere was used in all cases. Functional outcome, forward flexion, external rotation, and scapular notching were assessed at a minimum of 2 years postoperatively. RESULTS There was no difference in range of motion or functional outcome scores between the 2 groups. In the 155° group, forward flexion improved from 76° to 135° (P < .001) and external rotation remained unchanged (29° vs. 30°; P = .835). In the 135° group, postoperative forward flexion improved from 78° to 132° (P < .001) and external rotation was unchanged (28° vs. 29°; P = .814). Scapular notching was observed in 58% of cases with a 155° inclination compared with 21% with a 135° inclination (P = .009). CONCLUSION With a neutral glenosphere, there was no difference in postoperative forward flexion or external rotation after an RSA with a humeral inclination of 135° compared with 155°.
Scapular notching was reduced with the use of a 135° design compared with a 155° design but persists at a rate of 21% at 2-year follow-up in the absence of a lateralized glenosphere. BACKGROUND Preoperative quantitative assessment of glenoid bone loss, selection of the glenoid component, and definition of its desired location can be challenging. Placement of the glenoid component in the desired location at the time of surgery is difficult, especially with severe glenoid pathological conditions. METHODS Forty-six patients were randomly assigned to three-dimensional computed tomographic preoperative templating with either standard instrumentation or with patient-specific instrumentation and were compared with a nonrandomized group of seventeen patients with two-dimensional imaging and standard instrumentation used as historical controls. All patients had postoperative three-dimensional computed tomographic metal artifact reduction imaging to measure and to compare implant position with the preoperative plan. RESULTS Using three-dimensional imaging and templating with or without patient-specific instrumentation, there was a significant improvement in achieving the desired implant position within 5° of inclination or 10° of version when compared with two-dimensional imaging and standard instrumentation. CONCLUSION Three-dimensional assessment of glenoid anatomy and implant templating and the use of these images at the time of surgery improve the surgeon's ability to place the glenoid implant in the desired location.
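The group comparisons reported in these abstracts (for example, scapular notching in 58% vs 21% of cases, P = .009) rest on standard two-sample significance tests on proportions. A minimal sketch of the pooled two-proportion z-test, assuming illustrative per-arm counts of 29/50 and 10/48 (the abstract reports only percentages, so the exact counts are an assumption):

```python
# Hedged sketch of a pooled two-proportion z-test of the kind underlying
# comparisons such as "58% vs 21% notching (P = .009)". The arm sizes
# (50 and 48) are assumed for illustration; the abstract gives only percentages.
import math

def two_proportion_z(x1: int, n1: int, x2: int, n2: int):
    """Return (z, two-sided p) for H0: p1 == p2, using the pooled estimate."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF, expressed via erf.
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

z, p = two_proportion_z(29, 50, 10, 48)  # assumed counts for 58% vs ~21%
print(f"z = {z:.2f}, two-sided p = {p:.5f}")
```

With these assumed counts the test rejects at conventional levels, consistent with the small P value the abstract reports; a chi-square test on the same 2x2 table would give an equivalent result.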
11,046
24,558,020
There is very low-quality evidence that is insufficient to determine whether there is any significant clinical benefit in using fewer-than-four-ports laparoscopic cholecystectomy compared with four-port laparoscopic cholecystectomy.
BACKGROUND Traditionally, laparoscopic cholecystectomy is performed using two 10-mm ports and two 5-mm ports. Recently, a reduction in the number of ports has been suggested as a modification of the standard technique with a view to decreasing pain and improving cosmesis. The safety and effectiveness of using fewer-than-four ports has not yet been established. OBJECTIVES To assess the benefits (such as improvement in cosmesis and earlier return to activity) and harms (such as increased complications) of using fewer-than-four ports (fewer-than-four-ports laparoscopic cholecystectomy) versus four ports in people undergoing laparoscopic cholecystectomy for any reason (symptomatic gallstones, acalculous cholecystitis, gallbladder polyp, or any other condition).
Introduction. Laparoendoscopic single-site surgery (LESS) uses a multiple-entry portal in a single 3.0- to 4.0-cm incision in a natural scar, the umbilicus. The present study aimed to compare the inflammatory impact of classic video laparoscopic cholecystectomy (LC) versus LESS cholecystectomy. Methods. A prospective randomized controlled study was conducted from January to June 2011 at 2 university hospitals in Rio de Janeiro, Brazil. Fifty-seven patients (53 women, 4 men; mean age = 48.7 years) were randomly assigned to receive LC (n = 29) or LESS (n = 28) cholecystectomy. C-reactive protein (CRP) and interleukin 6 (IL-6) were measured from blood samples collected during induction of anesthesia and at 3 and 24 hours postoperatively. Results. Median IL-6 levels in the LESS and LC groups, respectively, were 2.96 and 4.5 pg/mL preoperatively, 11.6 and 28.05 pg/mL at 3 hours postoperatively (P = .029), and 13.18 and 15.1 pg/mL at 24 hours postoperatively (P = .52). Median CRP levels in the LESS and LC groups, respectively, were 0.33 and 0.44 mg/mL preoperatively, 0.40 and 0.45 mg/mL (P = .73) at 3 hours postoperatively, and 1.7 and 1.82 mg/mL (P = .84) at 24 hours postoperatively. We did not find a significant association between IL-6 (and CRP) and body mass index in the LESS group. Conclusions. LESS cholecystectomy requires a larger-size incision than LC. We found a tendency toward less postoperative pain following LESS cholecystectomy than LC. There was also a tendency toward lower early inflammatory impact following LESS cholecystectomy versus LC. Purpose: To evaluate the safety and feasibility of single-incision laparoscopic cholecystectomy (SILS-C) compared with conventional laparoscopic cholecystectomy (CLC). Methods: Sixty-five patients (SILS-C: 35, CLC: 30) were prospectively enrolled and operated on with conventional straight instruments.
The postoperative pain scores at 6 and 24 hours and 1 week, nausea, vomiting, commencement of oral intake, hospital stay, resumption of normal activities and work, and satisfaction levels were noted. Results: Twenty-eight percent (10/35) of SILS-C patients required introduction of additional trocars to complete the procedure. No patient required conversion to open surgery. All the morbidity parameters were similar in both groups, except that seroma formation in the wound was significantly higher in the SILS-C group [SILS-C: 17% (6/35) vs CLC: 0%, P = 0.038]. One patient in SILS-C had a major bile duct injury. Conclusions: SILS-C is safe and feasible with conventional instruments. However, caution needs to be exercised in view of a major bile duct injury and a higher rate of seroma formation in the wound. Background: Two-port laparoscopic cholecystectomy has been reported to be safe and feasible. However, whether it offers any additional advantages remains controversial. This study reports a randomized trial that compared the clinical outcomes of two-port laparoscopic cholecystectomy versus conventional four-port laparoscopic cholecystectomy. Methods: One hundred and twenty consecutive patients who underwent elective laparoscopic cholecystectomy were randomized to receive either the two-port or the four-port technique. All patients were blinded to the type of operation they underwent. Four surgical tapes were applied to standard four-port sites in both groups at the end of the operation. All dressings were kept intact until the first follow-up 1 week after surgery. Postoperative pain at the four sites was assessed on the first day after surgery using a 10-cm unscaled visual analog scale (VAS). Other outcome measures included analgesia requirements, length and difficulty of the operation, postoperative stay, and patient satisfaction score on surgery and scars. Results: Demographic data were comparable for both groups.
Patients in the two-port group had a shorter mean operative time (54.6 ± 24.7 min vs 66.9 ± 33.1 min for the four-port group; p = 0.03) and less pain at individual subcostal port sites [mean score using 10-cm unscaled VAS: 1.5 vs 2.8 (p = 0.01) at the midsubcostal port site and 1.3 vs 2.3 (p = 0.02) at the lateral subcostal port site]. Overall pain score, analgesia requirements, hospital stay, and patient satisfaction score on surgery and scars were similar between the two groups. Conclusion: Two-port laparoscopic cholecystectomy resulted in less individual port-site pain and similar clinical outcomes but fewer surgical scars compared to four-port laparoscopic cholecystectomy. Thus, it can be recommended as a routine procedure in elective laparoscopic cholecystectomy. Objective: To compare short-term surgical outcomes and quality of life (QOL) between single-port laparoscopic cholecystectomy (SPLC) and classic 4-port laparoscopic cholecystectomy (CLC). Background: There is significant interest in further reducing the trauma associated with surgical procedures. Although a number of observational studies have suggested that SPLC is a feasible alternative to CLC, there is a lack of data from randomized studies validating any benefit over CLC. Methods: Eligible patients were randomized to receive SPLC or CLC. Operative and perioperative outcomes, including cosmesis and QOL, were analyzed. Results: Forty-three patients were randomized to SPLC (n = 21) or CLC (n = 22). There were no significant differences between groups for most preoperative demographics, American Society of Anesthesiology score, gallstone characteristics, local inflammation, blood loss, or length of stay. Patients undergoing SPLC were older than those receiving CLC (57.3 years vs. 45.8 years, P < 0.05). Operative times for SPLC were greater than for CLC (88.5 minutes vs. 44.8 minutes, P < 0.05).
Overall and cosmetic satisfaction, QOL as determined by the SF-36 survey, postoperative complications, and postoperative pain scores between discharge and the 2-week postoperative visit were not significantly different between groups. Wound infection rates were similar in both groups. The SPLC group contained 1 retained bile duct stone, 1 port-site hernia, and 1 postoperative port-site hemorrhage. Conclusions: SPLC procedure time was longer and incurred more complications than CLC without significant benefits in patient satisfaction, postoperative pain, and QOL. SPLC may be offered in carefully selected patients. Larger randomized trials performed later in the learning curve with SPLC may identify more subtle advantages of one method over another. Purpose: We report the outcomes of a randomized clinical trial of single-port laparoscopic cholecystectomy (SPLC) and multiport laparoscopic cholecystectomy (MPLC). Methods: Fifty-four patients (27 in each group) were randomized. A visual analog scale was used with a 10-point scale for an objective assessment of incisional pain and incisional cosmesis on postoperative days 1, 3, and 14. Results: The mean operating time was significantly longer in the SPLC group. The mean cosmesis scores on postoperative days 3 (9.7 vs. 8.9, P = 0.01) and 14 (9.9 vs. 9.2, P < 0.01) were significantly greater in the SPLC group than in the MPLC group. The groups' mean visual analog scale scores for incisional pain, and their requirements for analgesics, did not differ significantly. Conclusions: Although SPLC takes longer than MPLC, experienced laparoscopic surgeons can perform SPLC safely with results comparable with those for MPLC.
SPLC is superior to MPLC in terms of short-term cosmetic outcomes. Although laparoscopic cholecystectomy has rapidly developed in the treatment of gall bladder disease, in the absence of controlled clinical trial data its outcome parameters compared with open cholecystectomy remain unclear. A prospective audit of the introduction of laparoscopic cholecystectomy in the west of Scotland over a two-year period was carried out to attempt to assess this new procedure. A total of 45 surgeons in 19 hospitals performing laparoscopic cholecystectomy submitted prospective data from September 1990-1992. A total of 2285 cholecystectomies were audited (a completed data collection rate of 99%). Laparoscopic cholecystectomy was attempted in 1683 (74%) patients and completed in 1448 patients (median conversion rate to the open procedure 17%). The median operation time in the completed laparoscopic cholecystectomy patients was 100 minutes (range 30-330) and overall hospital stay three days (1-33). There were nine deaths (0.5%) after laparoscopic cholecystectomy, although only two were directly attributable to the laparoscopic procedure. In the laparoscopic cholecystectomy group there were 99 complications (5.9%); 53 (3%) of these were major, requiring further invasive intervention. Forty patients (2.4%) required early or delayed laparotomy for major complications such as bleeding or bile duct injuries. There were 11 (0.7%) bile duct injuries in the laparoscopic cholecystectomy series; five were noted during the initial procedure and six were recognised later, resulting from jaundice or bile leaks. Ductal injuries occurred after a median of 20 laparoscopic cholecystectomies. In conclusion, laparoscopic cholecystectomy has rapidly replaced open cholecystectomy in the treatment of gall bladder disease.
Although the overall death and complication rate associated with laparoscopic cholecystectomy is similar to open cholecystectomy, the bile duct injury rate is higher. The main objectives of minisite cholecystectomy (MC) are to have smaller incisions, better cosmetic results, less trauma, and a lower morbidity rate. This prospective randomized study compares MC with conventional laparoscopic cholecystectomy (CLC) in terms of surgical trauma and cosmetic results in 44 patients. Conversion from MC to CLC was required in five patients. No conversion to open surgery was needed in the CLC group. The average operating time was slightly longer in the MC group, but the difference was not statistically significant (81 minutes versus 72 minutes, p = 0.22). The population characteristics, postoperative respiratory function measurements, pain scores, and analgesic requirements were similar in the two groups. The average score for scar tissue was significantly lower in the MC group (0.73 versus 1.93, p = 0.0045). Only the cosmetic results of MC were superior to CLC. This technique could be a feasible alternative procedure in patients seeking better cosmetic results. However, further studies with larger sample sizes are needed to evaluate the postoperative morbidity of MC.
BACKGROUND In recent years, new devices providing multiple channels have made the performance of laparoscopic cholecystectomy through a single access site not only feasible but much easier. The potential benefits of laparoendoscopic single-site (LESS) cholecystectomy may include scarless surgery, reduced postoperative pain, reduced postoperative length of stay, and improved postoperative quality of life. There are no comparative data between LESS cholecystectomy and standard laparoscopic cholecystectomy (LC) available at present with which to quantify these benefits. METHODS This study was a prospective, randomized, dual-institutional pilot trial comparing LESS cholecystectomy with standard LC. The primary end point was postoperative quality of life, measured as length of hospital stay, postoperative pain, cosmetic results, and SF-36 questionnaire scores. Secondary end points included operative time, conversion to standard LC, difficulty of exposure, difficulty of dissection, and complication rate. RESULTS No significant differences in postoperative lengths of stay were found in the two groups. Postoperative pain evaluation using a visual analogue scale showed significantly better outcomes in the standard LC arm on the same day of surgery (P = .041). No differences in postoperative pain were found at the next visual analogue scale evaluation or in the postoperative administration of pain-relieving medications. Cosmetic satisfaction was significantly higher in the LESS group at 1-month follow-up (mean, 94.5 ± 9.4% vs 86 ± 22.3%; median, 100% vs 90%; P = .025).
Among the 8 scales of the SF-36 assessing patients' physical and mental health, scores on the Role Emotional scale were significantly better in the LESS group (mean, 80.05 ± 29.42 vs 68.33 ± 25.31; median, 100 vs 66.67; P < .0001). CONCLUSIONS In this pilot trial, LESS cholecystectomy resulted in similar lengths of stay and improved cosmetic results and SF-36 Role Emotional scores but performed less well on pain immediately after surgery. A larger multicenter trial is needed to confirm and further investigate these results. OBJECTIVES To analyze sources searched in Cochrane reviews, to determine the proportion of trials included in reviews that are indexed in major databases, and to compare the quality of these trials with those from other sources. METHODS All new systematic reviews in the Cochrane Library, Issue 1, 2001, that were restricted to randomized controlled trials (RCTs) or quasi-RCTs were selected. The sources searched in the reviews were recorded, and the trials included were checked to see whether they were indexed in four major databases. Trials not indexed were checked to determine how they could be identified. The quality of trials found in major databases was compared with those found from other sources. RESULTS The number of databases searched per review ranged between one and twenty-seven. The proportions of the trials in the four databases were Cochrane Controlled Trials Register = 78.5%, MEDLINE = 68.8%, Embase = 65.0%, and Science/Social Sciences Citation Index = 60.7%. Searching another twenty-six databases after Cochrane Controlled Trials Register (CCTR), MEDLINE, and Embase found only 2.4% additional trials. There was no significant difference between trials found in the CCTR, MEDLINE, and Embase compared with other trials with respect to adequate allocation concealment or sample size.
CONCLUSIONS There was a large variation between reviews in the exhaustiveness of the literature searches. CCTR was the single best source of RCTs. Additional database searching retrieved only a small percentage of extra trials. Contacting authors and manufacturers to find unpublished trials appeared to be a more effective method of obtaining the additional better-quality trials. BACKGROUND This study aimed to compare the outcomes of single-incision laparoscopic cholecystectomy (SILC) versus conventional 4-port laparoscopic cholecystectomy (LC). METHODS From November 2009 to August 2010, 51 patients with symptomatic gallstones or gallbladder polyps were randomized to SILC (n = 24) or 4-port LC (n = 27). RESULTS Mean surgical time (43.5 vs 46.5 min), median blood loss (1 vs 1 mL), and mean hospital stay (1.5 vs 1.8 d) were similar for both the SILC and 4-port LC groups. There were no open conversions and no major complications. The mean total wound length of the SILC group was significantly shorter (1.76 vs 2.25 cm). The median visual analogue pain score at 6 hours after surgery was similar (4.5 vs 4.0), but the SILC group had a significantly worse pain score on day 7 (1 vs 0). There was no difference in time to resume usual activity (mean, 5.6 vs 5.0 d). The median cosmetic score of SILC was significantly higher at 3 months after surgery (7 vs 6). CONCLUSIONS SILC was feasible and safe for properly selected patients in experienced hands. A random sample of 4,807 men and women, aged 30, 40, 50, and 60 years, who lived in the western part of Copenhagen County, was drawn from the National Central Person Registry. A total of 226 subjects who were not of Danish origin were omitted. The response rate was 78.8% (3,608/4,581). Each person had his or her gallbladder examined by ultrasonography. The examinations took place between November 1982 and February 1984.
The overall prevalence of gallstone disease (cases with stones and cholecystectomized cases) in males aged 30, 40, 50, and 60 years was 1.8%, 1.5%, 6.7%, and 12.9%, respectively. The corresponding prevalence in females was 4.8%, 6.1%, 14.4%, and 22.4%, respectively. Differences according to sex were significant in all age groups. Differences between the 40- and 50-year and 50- and 60-year age groups were significant in both sexes. Among subjects with gallstone disease, the disease was unknown to the proband in the majority of males and in the 30-year-old females, but only in half of the women aged 40, 50, and 60 years. The prevalence of clinically diagnosed gallstones was not significantly different between respondents and nonrespondents. Objectives: With increasing surgeon experience, laparoscopic cholecystectomy has undergone many refinements, including reduction in port number and size. Three-port laparoscopic cholecystectomy has been reported to be safe and feasible in various clinical trials. However, whether it offers any additional advantages remains controversial. This study reports a randomized trial that compared the clinical outcomes of 3-port laparoscopic cholecystectomy versus conventional 4-port laparoscopic cholecystectomy. Methods: Seventy-five consecutive patients who underwent elective laparoscopic cholecystectomy were randomized to undergo either the 3-port or the 4-port technique. Four surgical tapes were applied to standard 4-port sites in both groups at the end of the operation. All dressings were kept intact until the first follow-up 1 week after surgery. Postoperative pain at the 4 sites was assessed on the first day after surgery by using a 10-cm unscaled visual analog scale (VAS). Other outcome measures included analgesia requirements, length of the operation, postoperative stay, and patient satisfaction score on surgery and scars.
Results: Demographic data were comparable for both groups. Patients in the 3-port group had a shorter mean operative time (47.3 ± 29.8 min vs 60.8 ± 32.3 min for the 4-port group; P = 0.04) and less pain at port sites (mean score using 10-cm unscaled VAS: 2.19 ± 1.06 vs 2.91 ± 1.20; P = 0.02). Overall pain score, analgesia requirements, hospital stay, and patient satisfaction score (mean score using 10-cm unscaled VAS: 8.2 ± 1.7 vs 7.8 ± 1.7, P = 0.24) on surgery and scars were similar between the 2 groups. Conclusion: Three-port laparoscopic cholecystectomy resulted in less individual port-site pain and similar clinical outcomes with fewer surgical scars and without any increased risk of bile duct injury compared with 4-port laparoscopic cholecystectomy. Thus, it can be recommended as a safe alternative procedure in elective laparoscopic cholecystectomy. Background Emerging attempts have been made to reduce operative trauma and improve cosmetic results of laparoscopic cholecystectomy. There is a trend towards minimizing the number of incisions, such as natural orifice transluminal endoscopic surgery (NOTES) and single-port laparoscopic cholecystectomy (SPLC). Many retrospective case series propose excellent cosmesis and reduced pain in SPLC. As the latter has been confirmed in a randomized controlled trial, patients' satisfaction with cosmesis is still controversially debated. Methods/Design The SPOCC trial is a prospective, multi-center, double-blinded, randomized controlled study comparing SPLC with 4-port conventional laparoscopic cholecystectomy (4PLC) in elective surgery. The hypothesis and primary objective is that patients undergoing SPLC will have a better outcome in cosmesis and body image 12 weeks after surgery. This primary endpoint is assessed using a validated 8-item multiple-choice-type questionnaire on cosmesis and body image.
The secondary endpoint has three entities: the quality of life 12 weeks after surgery assessed by the validated Short-Form-36 Health Survey questionnaire, postoperative pain assessed by a visual analogue scale, and the use of analgesics. Operative time, surgeon's experience with SPLC and 4PLC, use of additional ports, conversion to 4PLC or open cholecystectomy, length of stay, costs, time off work as well as intra- and postoperative complications are further aspects of the secondary endpoint. Patients are randomly assigned either to SPLC or to 4PLC. Patients as well as treating physicians, nurses, and assessors are blinded until the 7th postoperative day. Sample size calculation, performed by estimating a difference in cosmesis of 20% (alpha = 0.05 and beta = 0.90, drop-out rate of 10%), resulted in a number of 55 randomized patients per arm. Discussion The SPOCC trial is a prospective, multi-center, double-blind, randomized controlled study to assess cosmesis and body image after SPLC. Trial registration (clinicaltrials.gov): NCT. BACKGROUND This study presents preliminary data from a prospective, randomized, multicenter, single-blinded trial of single-incision laparoscopic cholecystectomy (SILC) versus standard laparoscopic cholecystectomy (4PLC). METHODS Patients with symptomatic gallstones, polyps, or biliary dyskinesia (ejection fraction < 30%) were randomized to SILC or 4PLC. Data included operative time, estimated blood loss, length of skin and fascial incisions, complications, pain, satisfaction and cosmetic scoring, and conversion. RESULTS Operating room time was longer with SILC (n = 50) versus 4PLC (n = 33). No differences were seen in blood loss, complications, or pain scores. Body image scores and cosmetic scores at 1, 2, 4, and 12 weeks were significantly higher for SILC. Satisfaction scores, however, were similar.
CONCLUSIONS Preliminary results from this prospective trial showed SILC to be safe compared with 4PLC, although operative times were longer. Cosmetic scores were higher for SILC compared with 4PLC. Satisfaction scores were similar, although both groups reported a significantly higher preference towards SILC BACKGROUND Minimally invasive techniques have become an integral part of general surgery, with recent investigation into single-incision laparoscopic cholecystectomy (SILC). This study presents the final 1-year results of a prospective, randomized, multicenter, single-blinded trial of SILC vs multiport cholecystectomy (4PLC). STUDY DESIGN Patients with biliary colic and documented gallstones or polyps or with biliary dyskinesia were randomized to SILC vs 4PLC. Data measures included operative details, adverse events, and conversion to 4PLC or laparotomy. Patients were followed for 12 months. RESULTS Two hundred patients underwent randomization to SILC (n=119) or 4PLC (n=81). Enrollment ranged from 1 to 50 patients per site, with 4 sites enrolling >25 patients. Total adverse events were not significantly different between groups (36% 4PLC vs 45% SILC; p=0.24), as were severe adverse events (4% 4PLC vs 10% SILC; p=0.11). Incision-related adverse events were higher after SILC (11.7% vs 4.9%; p=0.13), but all of these were listed as mild or moderate. Total hernia rates were 1.2% (1 of 81) in 4PLC patients vs 8.4% (10 of 119) in SILC patients (p=0.03). At 1-year follow-up, cosmesis scores continued to favor SILC (p<0.0001). CONCLUSIONS Results of this trial show SILC to be a safe and feasible procedure when compared with 4PLC, with similar total adverse events but with an identified significant increase in hernia formation.
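The hernia comparison above (1 of 81 vs 10 of 119, p=0.03) is the kind of sparse 2×2 table for which Fisher's exact test is the usual choice. A minimal sketch from the reported counts (the abstract does not say which test the authors used, so agreement with their p-value is illustrative, not a reconstruction of their analysis):

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]."""
    row1, col1, n = a + b, a + c, a + b + c + d
    def pmf(k):  # hypergeometric probability of k events in row 1
        return comb(row1, k) * comb(n - row1, col1 - k) / comb(n, col1)
    p_obs = pmf(a)
    # Sum probabilities of all tables as extreme or more extreme than observed.
    lo, hi = max(0, col1 - (n - row1)), min(row1, col1)
    return sum(pmf(k) for k in range(lo, hi + 1) if pmf(k) <= p_obs + 1e-12)

# Hernias: 1/81 after 4PLC vs 10/119 after SILC (counts from the abstract).
p = fisher_exact_two_sided(1, 80, 10, 109)
print(f"p = {p:.3f}")
```

On these counts the exact two-sided p comes out near 0.03, consistent with the reported value.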
Cosmesis scoring and patient preference at 12 months continue to favor SILC, and more than half of the patients were willing to pay more for a single-site surgery over a standard laparoscopic procedure. Additional longer-term population-based studies are needed to clarify if this increased rate of hernia formation as compared with 4PLC will continue to hold true Background The attempt to further reduce operative trauma in laparoscopic cholecystectomy has led to new techniques such as natural orifice transluminal endoscopic surgery (NOTES) and single-incision laparoscopic surgery (SILS). These new techniques are considered to be painless procedures, but no published studies investigate the possibility of different pain scores with these new techniques versus classic laparoscopic cholecystectomy. In this randomized control study, we investigated pain scores in SILS cholecystectomy versus classic laparoscopic cholecystectomy. Patients and methods Forty patients (34 women and 6 men) were randomly assigned to two groups. In group A (n=20), four-port classic laparoscopic cholecystectomy was performed. Patients in group B (n=20) underwent SILS cholecystectomy. In all patients, preincisional local infiltration of ropivacaine around the trocar wounds was performed. Infusion of ropivacaine solution in the right subdiaphragmatic area at the beginning of the procedure plus normal saline infusion in the same area at the end of the procedure was performed in all patients as well. Shoulder-tip and abdominal pain were registered at 2, 6, 12, 24, 48, and 72 h postoperatively using a visual analog scale (VAS). Results Significantly lower pain scores were observed in the SILS group versus the classic laparoscopic cholecystectomy group after the first 12 h for abdominal pain, and after the first 6 h for shoulder pain. Total pain after the first 24 h was nonexistent in the SILS group.
Also, requests for analgesics were significantly fewer in the SILS group, while no difference was observed in the incidence of nausea and vomiting between the two groups. Conclusion SILS cholecystectomy, as well as leaving an invisible scar, has significantly lower abdominal and shoulder pain scores, especially after the first 24 h postoperatively, when this pain is nonexistent. (Registration Clinical Trial number: NCT00872287, www.clinicaltrials.gov) BACKGROUND Laparoscopy through a single umbilical incision is an emerging technique supported by case series, but prospective comparative data are lacking. Therefore, we conducted a prospective, randomized trial comparing single-site umbilical laparoscopic cholecystectomy to 4-port laparoscopic cholecystectomy. METHODS After IRB approval, patients were randomized to laparoscopic cholecystectomy via a single umbilical incision or standard 4-port access. The primary outcome variable was operative time. Utilizing a power of 0.8 and an alpha of 0.05, 30 patients were calculated for each arm. Patients with complicated disease or weight over 100 kg were excluded. Post-operative management was controlled. Surgeons subjectively scored the degree of technical difficulty from 1=easy to 5=difficult. RESULTS From 8/2009 through 7/2011, 60 patients were enrolled. There were no differences in patient characteristics. Operative time and degree of difficulty were greater with the single-site approach. There were more doses of analgesics used and greater hospital charges in the single-site group that trended toward significance. CONCLUSION Single-site laparoscopic cholecystectomy produces longer operative times with a greater degree of difficulty as assessed by the surgeon. There was a trend toward more doses of post-operative analgesics and greater hospital charges with the single-site approach Methods for combining data from several studies exist and appear to be quite useful.
None satisfactorily addresses the question of what studies should be combined. This issue is the most serious methodological limitation. Even studies with statistically significant interaction might still be combined if the effect were in the same direction. Thus, substantial scientific input is required as to what criteria must be met by each potential study. Much can be learned from combining or pooling data, but it must be done cautiously. Pooling exercises do not replace well-designed prospective clinical trials. Efforts to establish basic design criteria to allow multicentre and multicountry trials to be more easily combined might be useful. Background Natural orifice transluminal endoscopic surgery (NOTES) and transumbilical endoscopic surgery (TUES) are being developed to further improve minimally invasive surgery. In 2006, the authors developed TUES using a single triple-channel trocar or single-trocar (ST) technique. To minimize the risk and further improve surgical efficiency, the procedure was optimized using a two-trocar (TT) technique, with both trocars in the umbilicus. This study compared the clinical results of the TT and ST techniques. Methods For this study, 32 patients with chronic gallbladder disease and indications for cholecystectomy were randomly assigned to undergo surgery with either the TT technique (17 patients) or the ST technique (15 patients). With the TT procedure, two modified 5-mm trocars with small handles were inserted through the navel, one above and one below the umbilicus. Another 2-mm trocar was inserted for a grasper in the right upper abdomen. With the ST procedure, one 15-mm umbilical incision was made for insertion of a previously developed triple-channel trocar to apply the laparoscope, grasper, and dissector individually. Operation time, postoperative hospital stay, and postoperative pain were compared between the two procedures.
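The caveats above about which studies may be combined apply to any pooled estimate; mechanically, the common inverse-variance fixed-effect combination simply weights each study's effect by the reciprocal of its variance. A minimal sketch, using made-up mean differences and standard errors purely for illustration:

```python
from math import sqrt

def fixed_effect_pool(effects_and_ses):
    """Inverse-variance fixed-effect pooled estimate and its standard error."""
    weights = [1 / se**2 for _, se in effects_and_ses]
    pooled = sum(w * e for (e, _), w in zip(effects_and_ses, weights)) / sum(weights)
    return pooled, sqrt(1 / sum(weights))

# Hypothetical per-study mean differences (e.g. VAS points) with standard errors.
studies = [(0.5, 0.2), (0.3, 0.1), (0.8, 0.4)]
pooled, se = fixed_effect_pool(studies)
print(f"pooled = {pooled:.3f}, 95% CI half-width = {1.96 * se:.3f}")
```

Note that this fixed-effect model assumes a single common effect across studies; when that assumption fails (the heterogeneity the passage warns about), a random-effects model or no pooling at all is the more defensible choice.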
Results The mean operative time was significantly shorter with the TT technique (35.71±9.74 min) than with the ST technique (125.25±18.9 min; p<0.001). Use of analgesics after surgery also was less in the TT group than in the ST group (0 vs. 7, respectively; p<0.05). The postoperative hospital stay did not differ significantly between the two groups (p>0.05). Conclusions Although both procedures were based on the transumbilical approach, the TT approach was found to be faster and less painful than the ST approach. The difference in cosmetic result was minimal BACKGROUND The single-incision laparoscopic approach for cholecystectomy has been reported to be cosmetically superior to the traditional four-port technique in several case series; however, prospective comparative data are lacking. We conducted a 60-patient, prospective, randomized trial comparing single-incision laparoscopic cholecystectomy with standard four-port cholecystectomy, including validated scar assessment evaluation around 6 weeks and 18 months after the operation, in an effort to determine if a cosmetic advantage existed. PATIENTS AND METHODS Patients over 12 years of age and parents of patients under 12 years of age enrolled in the trial were asked to complete the validated Patient Scar Assessment Questionnaire (PSAQ). The PSAQ consists of four subscales: Appearance, Consciousness, Satisfaction with Appearance, and Satisfaction with Symptoms. The Symptoms subscale is omitted from analysis per PSAQ instructions because of insufficient reliability. Each subscale is a set of items with 4-point categorical responses (from 1=most favorable to 4=least favorable). The sum of the questions quantifies each subscale. Data are expressed as mean±standard deviation values. RESULTS Eighteen single-site patients and 8 four-port patients completed early questionnaires, in which there was no difference in overall scar assessment (P=.17).
Telephone follow-up was accomplished for 17 single-site patients and 24 four-port patients and revealed that the overall scar assessment significantly favored the single-site approach (P=.04). CONCLUSIONS Patients or parents of patients do not identify an overall superior scar assessment at early follow-up after single-site laparoscopic versus four-port cholecystectomy. However, they do perceive a superior scar assessment at long-term follow-up, suggesting that there is a cosmetic benefit favoring the single-site approach Background Minimally invasive techniques have become an integral part of general surgery, with recent investigation into single-incision laparoscopic cholecystectomy (SILC). This study presents a prospective, randomized, multicenter, single-blind trial of SILC compared with four-port cholecystectomy (4PLC) with the goal of assessing safety, feasibility, and factors predicting outcomes. Methods Patients with biliary colic and documented gallstones or polyps or with biliary dyskinesia were randomized to SILC or 4PLC. Data measures included operative details, adverse events, and conversion to 4PLC or laparotomy. Pain, cosmesis, and quality-of-life scores were documented. Patients were followed for 12 months. Results Two hundred patients were randomized to SILC (n=117) or 4PLC (n=80) (3 patients chose not to participate after randomization). Patients were similar except for body mass index (BMI), which was lower in the SILC patients (28.9 vs. 31.0, p=0.011). One SILC patient required conversion to 4PLC. Operative time was longer for SILC (57 vs. 45 min, p<0.0001), but outcomes, including total adverse events, were similar (34% vs. 38%, p=0.55). Cosmesis scores favored SILC (p<0.002), but pain scores were lower for 4PLC (1-point difference on a 10-point scale, p<0.028) despite equal analgesia use. Wound complications were greater after SILC (10% vs.
3%, p=0.047), but hernia recurrence was equivalent for both procedures (1.3% vs. 3.4%, p=0.65). Univariate analysis showed female gender, SILC, and younger age to be predictors of increased pain scores, while SILC was associated with improved cosmesis scores. Conclusions In this multicenter randomized controlled trial of SILC versus 4PLC, SILC appears to be safe with a similar biliary complication profile. Pain scores and wound complication rates are higher for SILC; however, cosmesis scores favored SILC. For patients preferring a better cosmetic outcome and willing to accept possible increased postoperative pain, SILC offers a safe alternative to the standard 4PLC. Further follow-up is needed to detail the long-term risk of wound morbidities, including hernia recurrence Background The aim of this study was to compare the outcomes of single-incision laparoscopic cholecystectomy (SILC) and conventional laparoscopic cholecystectomy (CLC). Method Patients' inclusion criteria were uncomplicated gallstones, BMI ≤30, ASA score ≤2, and no past surgery in the upper abdomen. Five surgeons performed only SILC and seven only CLC. Data analyzed included operative time, morbidity, quality of life (QOL), cosmetic result, and global patient satisfaction. The last three parameters were evaluated 3 months after surgery. QOL was assessed with the Gastrointestinal Quality of Life Index (GIQLI) questionnaire. Cosmetic result and patient satisfaction were rated using a 5-grade Likert scale. Results This study included 104 patients operated on between April and June 2010. A SILC was performed in 35 patients and a CLC in 69. The preoperative characteristics of the two groups were similar. Median operative time for SILC was higher than that for CLC: 55 versus 40 min (p<0.001). Postoperative complications (0 vs. 2) and postoperative GIQLI scores (123±13 vs. 121±18) were not significantly different between groups.
Cosmetic result and patient satisfaction were better for SILC than for CLC. The percentages of results rated as excellent were 68 versus 37% (p<0.006) and 80 versus 57% (p<0.039), respectively. For the whole group, multivariate statistical analysis revealed that postoperative GIQLI score and cosmetic result were independent predictive factors of patient satisfaction. The percentages of satisfaction rated as excellent were greater in patients who had a postoperative GIQLI score ≥130 (92 vs. 49%, odds ratio [OR]=4, p<0.001) and in patients who had an excellent cosmetic result (82 vs. 47%, OR=7, p<0.001). Conclusions Compared to CLC, SILC is associated with a longer operative time, equivalent morbidity and QOL, and a better cosmetic result. The improved aesthetic result also leads to better global patient satisfaction Single-incision laparoscopic cholecystectomy (SILC) is increasingly practiced, but there have been no well-powered randomized trials investigating the technique. This non-inferiority trial aims to compare SILC with conventional four-port laparoscopic cholecystectomy (LC) with postoperative pain as the primary endpoint INTRODUCTION: A novel single-port access (SPA) cholecystectomy approach is described in this study. We have designed a randomised comparative study in order to elucidate any possible differences between the standard treatment and this novel technique. MATERIALS AND METHODS: Between July 2009 and March 2010, 140 adult patients with gallbladder pathologies were enrolled in this multicentre study. Two surgeons (RV and UB) randomised patients to either a standard laparoscopic (SL) approach group or to an SPA cholecystectomy group. Two types of trocars were used for this study: the TriPort™ and the SILS™ Port. Outcomes including blood loss, operative time, complications, length of stay and pain were recorded.
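The odds ratios reported above (OR=4, OR=7) come from a multivariate model that cannot be reproduced from the abstract alone, but the basic 2×2 odds ratio with a Woolf (log-normal) confidence interval is computed as sketched below. The counts are made-up numbers for illustration, not the trial's data:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for the 2x2 table [[a, b], [c, d]] with a Woolf 95% CI."""
    or_ = (a * d) / (b * c)
    se_log = sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo, hi = exp(log(or_) - z * se_log), exp(log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: satisfied/unsatisfied patients in two groups.
or_, lo, hi = odds_ratio_ci(40, 10, 20, 30)
print(f"OR = {or_:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```

A confidence interval excluding 1 corresponds to the significant associations the abstract reports; adjusted ORs from a multivariate model will generally differ from these crude 2×2 values.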
RESULTS: There were 69 patients in the SPA group and 71 patients in the SL group. The mean age of the patients was 43.2 (17-77) for the SPA group and 42.6 (19-70) for the SL group. The mean operative time was 63.9 min in the SPA group and 58.4 min in the SL group. One SPA procedure was converted to a standard laparoscopic technique, and one SL procedure was converted to an open approach. Complications occurred in eight patients: five seromas (two in the SPA group) and three hernias (one in the SPA group). The mean hospital stay was 38.5 h in the SPA group and 24.1 h in the SL group. Pain evaluated by visual analogue scale (VAS) after 24 h was 2 in the SPA group and 2.9 in the SL group (P<0.001). The degree of satisfaction was higher in the SPA group (8.3 versus 6.7). Similar results were found for the aesthetic result (8.8 versus 7.5) (P<0.001). CONCLUSION: Single-port transumbilical laparoscopic cholecystectomy can be feasible and safe. When technical difficulties arise, early conversion to a standard laparoscopic technique is advised to avoid serious complications. The SPA approach can be undertaken without the expense of additional operative time and provides patients with minimal scarring. The cosmetic results and the degree of satisfaction appear to be significant for the SPA approach Background Laparoendoscopic single-site surgery (LESS) has emerged as a technique that uses a natural scar, the umbilicus, within which a multiple-entry portal is placed into a 3.0-4.0-cm single incision to perform operations. The objective of this study was to compare incision size, wound complications, and postoperative pain of LESS with those of laparoscopic cholecystectomy (LC). Methods A prospective randomized controlled study was conducted between January and June 2011 at two university hospitals in Rio de Janeiro, Brazil.
Fifty-seven patients were randomly assigned to undergo laparoscopic or LESS cholecystectomy. Skin and aponeurosis wound sizes were recorded. A 10-point visual analog scale (VAS) was used to assess pain at postoperative hours 3 and 24. Healing and wound complications were assessed at follow-up. Results A total of 57 patients, 53 women and 4 men with a mean age of 48.7 years, were randomly assigned to undergo LESS (n=28) or LC (n=29). The mean length of the umbilical skin incision was 4.0 cm (range=2.1-5.8) in LESS and 2.7 cm (1.5-5.1) in LC (p<.0001). The mean internal aponeurosis diameter was 3.5 cm (2.0-5.5) in LESS and 2.3 cm (1.2-3.5) in LC (p<.0001). The mean operative time was 60.3 min (32-128) for LESS and 51.3 min (25-120) for LC (p=0.11). Gallbladder perforation at detachment occurred in 15.69% of the LESS cases and in 5.88% of the LC cases (p=0.028). The mean VAS score for pain at hour 3 was 2.0 points (0-7) for the LESS group and 4.0 (0-10) for the LC group (p=0.07), and at postoperative hour 24 it was 0.3 points (0-6) for LESS and 2.3 (0-10) for LC (p=0.03). There were no significant differences in wound complications. Incisional hernias were not found in either group. Conclusions The LESS single-port (SP) operations demand a bigger incision than LC surgery. However, there were no differences in healing, wound infections, and hernia development. We found a tendency toward less postoperative pain associated with LESS/SP than with LC Background This study aimed to compare the short-term outcomes of single-access laparoscopic cholecystectomy (SALC) and conventional laparoscopic cholecystectomy (CLC). Methods In a prospective study, patients with symptomatic cholelithiasis were randomized to SALC or CLC with follow-up at 1 week, 1 and 6 months.
The primary end point of this study was to assess the total outcomes of quality of life using the EuroQoL EQ-5D questionnaire. The secondary end points were postoperative pain, analgesia requirement and duration of use, operative time, perioperative complications, estimated blood loss, hospital stay, cosmesis outcome, and number of days required to return to normal activities. Results A total of 269 patients were prospectively randomized into two groups (125 in each group after excluding 19 patients for various reasons). The SALC procedure was done safely without intraoperative or major postoperative complications. In four SALC patients, an extra epigastric port was inserted to enhance exposure. There was no open conversion in either group. SALC patients reported better results in four of the EuroQoL EQ-5D dimensions (mobility, self-care, activity, and pain/discomfort) at 1 week after surgery, an improved pain profile at 4, 12, and 24 h, better cosmetic outcome at 1 and 6 months (P≤0.01), shorter duration of need for analgesia (P≤0.02), and earlier return to normal activities (P≤0.026). Operative times, hospital stay, QOL at 1 and 6 months postoperatively, and estimated blood loss were similar for both procedures. Conclusion This study supports other studies showing that SALC is a feasible and promising alternative to traditional laparoscopic cholecystectomy in selected patients, with better cosmesis, QOL, and improved postoperative pain results, and it can be performed with existing laparoscopic instruments OBJECTIVE To compare the transumbilical technique of laparoscopic cholecystectomy with standard laparoscopic cholecystectomy. DESIGN Randomised open study. SETTING Teaching hospital, Italy. SUBJECTS 90 patients who required elective cholecystectomy under general anaesthesia. INTERVENTIONS Standard laparoscopic cholecystectomy through 4 ports or transumbilical cholecystectomy through 2 ports.
MAIN OUTCOME MEASURES Amount of pain and analgesia, cost, side effects, and cosmesis. RESULTS 25 patients were excluded from analysis (8 in the standard group because relevant data were not recorded; and 17 in the transumbilical group, in 4 of whom relevant data were not recorded, and 13 for technical reasons). 32 patients who had standard and 25 who had transumbilical cholecystectomy had operative cholangiograms. There were no complications, no side effects, and no conversions to open cholecystectomy. Those who had transumbilical cholecystectomy had significantly lower pain scores (p<0.05) and required significantly less analgesia during the first 24 hours (p<0.05) than those who had standard laparoscopic cholecystectomy. CONCLUSION Once the learning curve has been completed, transumbilical cholecystectomy is possible without some of the difficulties associated with standard laparoscopic cholecystectomy BACKGROUND Three to four trocars are commonly used when performing laparoscopic cholecystectomy (LC). Subcostal and lateral trocars are used for the grasper to retract the gallbladder. These graspers are seldom extracted or exchanged with other instruments. Based on this, it seems that the subcostal and lateral trocars are of minimal importance. The aim of this study was to evaluate the validity and benefits of performing LC without subcostal and lateral trocars (LCWSL). METHODS From June 2006 to June 2007, 60 patients diagnosed with gallbladder disease were enrolled in this randomized, controlled trial to compare the results of LCWSL with conventional LC (CLC). Operation time, complications, pain scale, cosmetic effect, and hospital cost were compared. RESULTS There were no differences in operation time and intra- or postoperative morbidity. Total blood loss, pain, duration until resumption of oral diet, and duration of hospital stay were similar.
The total cost of LCWSL was cheaper than CLC by $397 USD (P<0.05), and the total incision length was smaller than CLC by 11 mm. CONCLUSIONS LCWSL seems to be an acceptable procedure, having the same morbidity and better economic and cosmetic outcomes compared with CLC This three-armed randomized clinical trial, with blinding of patients and outcome assessors, tested the hypothesis that single-port (SP) and/or minilaparoscopic (ML) cholecystectomy are superior to conventional laparoscopic (CL) cholecystectomy. Background: Since the first laparoscopic cholecystectomy (LC) was reported in 1990, it has met with widespread acceptance as a standard procedure using four trocars. The fourth (lateral) trocar is used to grasp the fundus of the gallbladder so as to expose Calot's triangle. It has been argued that the fourth trocar is not necessary in most cases. Therefore, the aim of this study was to compare the three-port vs the four-port technique. Methods: Between 1998 and 2000, 200 consecutive patients undergoing elective LC for gallstone disease were randomized to be treated via either the three- or four-port technique. Results: There was no difference between the two groups in age, sex, or weight. In terms of outcome, there was no difference between the two groups in success rate, operating time, number of oral analgesic tablets (paracetamol), visual analogue score, or postoperative hospital stay; however, the three-port group required fewer analgesic injections (nalbuphine) (0.4 vs 0.77, p=0.024). Conclusion: The three-port technique is as safe as the standard four-port one for LC.
The main advantages of the three-port technique are that it causes less pain, is less expensive, and leaves fewer scars Background The purpose of this study was to compare the postoperative inflammatory response and severity of pain between single-incision laparoscopic surgery (SILS) cholecystectomy and conventional laparoscopic cholecystectomy (LC). Methods Two groups of 20 patients were prospectively randomized to either conventional LC or SILS cholecystectomy. Serum interleukin-6 (IL-6) levels were assayed before surgery, at 4-6 h, and at 18-24 h after the procedure. Serum C-reactive protein (CRP) levels also were assayed at 18-24 h after surgery. Pain was measured at each of three time points after surgery using the visual analogue scale (VAS). The number of analgesia doses administered in the first 24 h after the procedure also was recorded, and 30-day surgical outcomes were documented. Results The groups had equivalent body mass index (BMI), age, and comorbidity distribution. Peak IL-6 levels occurred 4-6 h after surgery, and the median level was 12.8 pg/ml in the LC and 8.9 pg/ml in the SILS group (p=0.5). The median CRP level before discharge was 1.6 mg/dl in the LC and 1.9 mg/dl in the SILS group (p=0.38). There was no difference in either analgesic use or pain intensity as measured by the VAS between the two groups (p=0.72). The length of the surgical procedure was significantly longer in the SILS group (p<0.001). No intraoperative complications occurred in either group. Conclusions Single-incision laparoscopic surgery does not significantly reduce systemic inflammatory response, postoperative pain, or analgesic use compared with LC This prospective randomized study compared single-incision laparoscopic cholecystectomy (SILC) and laparoscopic cholecystectomy (LC) with respect to estimated blood loss, operative time, postoperative pain levels, and complications.
Thirty-four study patients were divided into 2 groups: 17 patients underwent SILC and 17 underwent LC. Operative time was longer for SILC than for LC, and the difference was statistically significant (P<0.001). There was no statistically significant difference in the relationship of body mass index with operative time between SILC and LC (P=0.613 and P=0.983, respectively). The 2 groups had no statistically significant differences with respect to visual analog scale scores, estimated blood loss, shoulder pain, or complications (P>0.05). SILC can be the treatment of choice for gallbladder disease. Although the surgeon's first several attempts at SILC require a longer operative time compared with LC, there are no differences in hospital length of stay, blood loss, complication rates, or pain scores between SILC and LC Background Single-incision laparoscopic cholecystectomy (SILC) is a newer approach that may be a safe alternative to traditional laparoscopic cholecystectomy (TLC), based on retrospective and small prospective studies. As the demand for single-incision surgery may be driven by patient perceptions of benefits, we designed a prospective randomized study using patient-reported outcomes as our end points. Methods Patients deemed candidates for either SILC or TLC were offered enrollment in the study. After induction of anesthesia, patients were randomized to SILC or TLC. Preoperative characteristics and operative data were recorded, including length of stay (LOS). Pain scores in recovery and for 48 h and satisfaction with wound appearance at 2 and 4 weeks were reported by patients. We used the gastrointestinal quality of life index (GIQLI) survey preoperatively and at 2 and 4 weeks postoperatively to assess recovery. Procedural and total hospital costs per case were abstracted from hospital billing systems.
Results Mean age of the study group was 44.1 years (±14.8), 87% were Caucasian, and 77% were female, with no difference between groups. Operative times were longer for SILC (median=57 vs. 47 min, p=0.008), but mean LOS was similar (6.8±4.2 h SILC vs. 6.2±4.8 h TLC, p=0.59). Operating room cost and encounter cost were similar. GIQLI scores were not significantly different preoperatively or at 2 or 4 weeks postoperatively. Patients reported higher satisfaction with wound appearance at 2 weeks with SILC. There were no differences in pain scores in recovery or in the first 48 h, although SILC patients required significantly more narcotic in recovery (19 mg morphine equivalent vs. 11.5, p=0.03). Conclusions SILC is a longer operation but can be done at the same cost as TLC. Recovery and pain scores are not significantly different. There may be an improvement in patient satisfaction with wound appearance. Both procedures are valid approaches to cholecystectomy Transumbilical single-incision laparoscopic cholecystectomy (SILC) and minilaparoscopic cholecystectomy (MLC) are both increasingly being used to treat symptomatic gallstones. The present study compared SILC and MLC with respect to outcome in a prospective randomized trial Background and study aims Natural orifice transluminal endoscopic surgery (NOTES) is a technique still in experimental development whose safety and effectiveness call for assessment through clinical trials. In this paper we present a three-arm, noninferiority, prospective randomized clinical trial of 1-year duration comparing the vaginal and transumbilical approaches for transluminal endoscopic surgery with the conventional laparoscopic approach for elective cholecystectomy.
Patients and methods Sixty female patients between the ages of 18 and 65 years who were eligible for elective cholecystectomy were randomized in a ratio of 1:1:1 to receive hybrid transvaginal NOTES (TV group), hybrid transumbilical NOTES (TU group) or conventional laparoscopy (CL group). The main study variable was parietal complications (wound infection, bleeding, and eventration). The analysis was by intention to treat, and losses were not replaced. Results Cholecystectomy was successfully performed in 94% of the patients. One patient in the TU group was converted to CL owing to difficulty in maneuvering the endoscope. After a minimum follow-up period of 1 year, no differences were noted in the rate of parietal complications. Postoperative pain, length of hospital stay, and time off from work were similar in the three groups. No patient developed dyspareunia. Surgical time was longer among cases in which a flexible endoscope was used (CL, 47.04 min; TV, 64.85 min; TU, 59.80 min). Conclusions NOTES approaches using the flexible endoscope are not inferior in safety or effectiveness to conventional laparoscopy. The transumbilical approach with a flexible endoscope is as effective and safe as the transvaginal approach and is a promising single-incision approach Background and Aims Acute-phase proteins and inflammatory cytokines mediate measurable responses to surgical trauma, which are proportional to the extent of tissue injury and correlate with post-operative outcome. By comparing systemic stress following multi-port (LC) and single-incision laparoscopic cholecystectomy (SILC), we aim to determine whether reduced incision size induces a reduced stress response. Methods Thirty-five consecutive patients were included: 11 underwent SILC (mean±SEM; age 44.8±3.88 years; BMI 27±1.44 kg/m2) and 24 underwent LC (56.17±2.80 years; 31.72±1.07 kg/m2; p<0.05).
Primary endpoint measures included levels of interleukin-6 and C-reactive protein measured pre- and post-operatively. Length-of-stay (LOS) and postoperative morbidity were secondary endpoints. Results No statistically significant differences were found between SILC and LC for interleukin-6 and C-reactive protein levels, LOS and duration of surgery. There was also no correlation between systemic stress response and operative parameters. There were no intra-operative complications. Conclusions SILC appears to be a safe, feasible technique with potential advantages of cosmesis, reduced incisional pain, and well-being recommending its use. These data indicate no difference in systemic stress and morbidity between SILC and LC. A larger, multi-centred, randomised prospective trial is warranted to further investigate and confirm this finding BACKGROUND Several studies have reported faster recoveries, lower pain scores, and superior cosmetic results after mini-laparoscopic cholecystectomy (MLC). The purpose of this study was to perform a randomized controlled trial, comparing MLC with conventional laparoscopic cholecystectomy (LC). SUBJECTS AND METHODS Forty-one patients with symptomatic cholecystolithiasis were randomized between the two groups: 23 having undergone LC and 18 MLC. The primary end point was postoperative pain, which was evaluated during the first 24 hours postoperatively, using the numerical rating scale. Patient satisfaction with the cosmetic result was evaluated after 1 month. RESULTS The two groups were comparable concerning age, sex, and body mass index. The median operating time (42 minutes versus 45 minutes; P=.386), complication rate, and duration of hospital stay (2 days; P=.611) were similar in both groups. The level of postoperative pain was analogous at every time. There was no difference in the analgesic requirements or cosmesis.
CONCLUSIONS MLC showed similar results concerning postoperative pain and did not lead to greater patient satisfaction with the cosmetic result, compared with LC. MLC did not take longer to perform, nor was it associated with major complications or a high conversion rate. MLC is a safe and feasible technique for the treatment of gallbladder disease in elective patients Background: Single-incision laparoscopic surgery may reduce the complications of port site and postoperative pain. The improved cosmetic result also may improve the satisfaction of patients who have undergone surgery. Methods: The study enrolled 108 patients who consecutively underwent laparoscopic cholecystectomy by the same surgeons and randomly divided them into single-incision laparoscopic cholecystectomy (SILC) and conventional laparoscopic cholecystectomy (CLC) groups. Demographic data and short-term operative outcomes were collected and compared. Results: A total of 57 and 51 patients received SILC and CLC, respectively, from May to August 2010 at our institution. No significant difference was found with respect to demographic data including age, sex, and body mass index between the 2 groups. Similarly, short-term operative outcomes such as postoperative complications, length of stay, and visual analog pain score did not differ between the 2 groups. However, the incision of SILC (21.6±2.4) was shorter than that of CLC (30.8±2.6) (P=0.032). Conclusions: SILC seems to be a safe and feasible technique. It can be undertaken without the expense of added postoperative complications and operative time and provides patients with a minimal apparent scar BACKGROUND/AIMS Since the first successful laparoscopic cholecystectomy with the establishment of pneumoperitoneum in France by Mouret in 1987, it has become the gold standard for cholecystectomy. Generally, techniques with four trocars have been used by surgeons, but some of them prefer 3-trocar techniques.
Our aim is to compare the clinical outcomes of the three- and four-port techniques prospectively. METHODOLOGY Between 1998 and 2003, one hundred and forty-six consecutive patients who underwent elective laparoscopic cholecystectomy for cholelithiasis in the Medical Faculty of Suleyman Demirel University were randomized to receive either the three-port or the four-port technique. Operative time (time from the beginning of the insufflation up to the closure of the skin), success rate, visual analogue pain score, analgesia requirements, and postoperative hospital stay were compared. RESULTS No differences between the two groups could be found. CONCLUSIONS The three-port technique is safe, effective, and economic but does not reduce the overall pain score and analgesia requirement Since its introduction in 1987, the technique of cholecystectomy has continued to undergo evolution. Surgeons have reduced the port size, number or both to achieve improvement in postoperative pain control, rapid return to activity and better cosmetic results. Therefore, this study was done to compare the standard 4-port laparoscopic cholecystectomy (LC) with the 3-port laparoscopic cholecystectomy using a 5 mm telescope instead of a 10 mm telescope (mini laparoscopic cholecystectomy - MLC). Forty patients were randomised to each group. Mean operating time, intraoperative and postoperative complications, mean period to resume walking, eating and return to normal activities and mean hospital stay were similar in the two groups. The level of postoperative pain was significantly lower in the MLC group. Patients who underwent MLC required a significantly lower dose of analgesics. In conclusion, mini laparoscopic cholecystectomy is a feasible and safe procedure with less postoperative pain and better cosmesis and without increased complications Overwhelming evidence shows the quality of reporting of randomised controlled trials (RCTs) is not optimal.
Without transparent reporting, readers cannot judge the reliability and validity of trial findings nor extract information for systematic reviews. Recent methodological analyses indicate that inadequate reporting and design are associated with biased estimates of treatment effects. Such systematic error is seriously damaging to RCTs, which are considered the gold standard for evaluating interventions because of their ability to minimise or avoid bias. A group of scientists and editors developed the CONSORT (Consolidated Standards of Reporting Trials) statement to improve the quality of reporting of RCTs. It was first published in 1996 and updated in 2001. The statement consists of a checklist and flow diagram that authors can use for reporting an RCT. Many leading medical journals and major international editorial groups have endorsed the CONSORT statement. The statement facilitates critical appraisal and interpretation of RCTs. During the 2001 CONSORT revision, it became clear that explanation and elaboration of the principles underlying the CONSORT statement would help investigators and others to write or appraise trial reports. A CONSORT explanation and elaboration article was published in 2001 alongside the 2001 version of the CONSORT statement. After an expert meeting in January 2007, the CONSORT statement has been further revised and is published as the CONSORT 2010 Statement. This update improves the wording and clarity of the previous checklist and incorporates recommendations related to topics that have only recently received recognition, such as selective outcome reporting bias. This explanatory and elaboration document, intended to enhance the use, understanding, and dissemination of the CONSORT statement, has also been extensively revised. It presents the meaning and rationale for each new and updated checklist item, providing examples of good reporting and, where possible, references to relevant empirical studies.
Several examples of flow diagrams are included. The CONSORT 2010 Statement, this revised explanatory and elaboration document, and the associated website (www.consort-statement.org) should be helpful resources to improve reporting of randomised trials Published evidence suggests that aspects of trial design lead to biased intervention effect estimates, but findings from different studies are inconsistent. This study combined data from 7 meta-epidemiologic studies and removed overlaps to derive a final data set of 234 unique meta-analyses containing 1973 trials. Outcome measures were classified as "mortality," "other objective," or "subjective," and Bayesian hierarchical models were used to estimate associations of trial characteristics with average bias and between-trial heterogeneity. Intervention effect estimates seemed to be exaggerated in trials with inadequate or unclear (vs. adequate) random-sequence generation (ratio of odds ratios, 0.89 [95% credible interval {CrI}, 0.82 to 0.96]) and with inadequate or unclear (vs. adequate) allocation concealment (ratio of odds ratios, 0.93 [CrI, 0.87 to 0.99]). Lack of or unclear double-blinding (vs. double-blinding) was associated with an average of 13% exaggeration of intervention effects (ratio of odds ratios, 0.87 [CrI, 0.79 to 0.96]), and between-trial heterogeneity was increased for such studies (SD increase in heterogeneity, 0.14 [CrI, 0.02 to 0.30]). For each characteristic, average bias and increases in between-trial heterogeneity were driven primarily by trials with subjective outcomes, with little evidence of bias in trials with objective and mortality outcomes. This study is limited by incomplete trial reporting, and findings may be confounded by other study design characteristics.
Bias associated with study design characteristics may lead to exaggeration of intervention effect estimates and increases in between-trial heterogeneity in trials reporting subjectively assessed outcomes
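The "ratio of odds ratios" reported in the meta-epidemiologic abstract above compares the pooled treatment odds ratio in trials with a design flaw to that in trials without it; a ratio below 1 indicates apparent exaggeration of benefit in the flawed trials. A minimal sketch of the arithmetic follows; the 2x2 counts are hypothetical, and the original study estimated this quantity with Bayesian hierarchical models rather than a simple two-stratum calculation:

```python
# Ratio of odds ratios (ROR): illustrative only. All counts below are
# hypothetical; they are not taken from the study described above.

def odds_ratio(events_t, no_events_t, events_c, no_events_c):
    """Odds ratio for treatment vs. control from a 2x2 table."""
    return (events_t * no_events_c) / (no_events_t * events_c)

# Stratum 1: hypothetical pooled counts from adequately blinded trials
or_blinded = odds_ratio(30, 70, 40, 60)

# Stratum 2: hypothetical pooled counts from inadequately blinded trials
or_unblinded = odds_ratio(20, 80, 40, 60)

# ROR < 1 means the flawed (unblinded) trials show a more favourable
# treatment effect, i.e. apparent exaggeration of benefit
ror = or_unblinded / or_blinded
print(round(ror, 3))
```

With these made-up counts the unblinded stratum yields a smaller odds ratio than the blinded stratum, so the ROR falls below 1, mirroring the direction of the 0.87 estimate reported for lack of blinding.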
11,047
26,558,665
The authors found a statistically significant reduction in blood loss and transfusion rates when using topical tranexamic acid in primary TKA . Furthermore , the currently available evidence does not support an increased risk of deep venous thrombosis or pulmonary embolism due to tranexamic acid administration . Topical tranexamic acid was effective for reducing postoperative blood loss and transfusion requirements without increasing the prevalence of thromboembolic complications
There has been much debate and controversy about the safety and efficacy of the topical use of tranexamic acid in primary total knee arthroplasty (TKA). The purpose of this study was to perform a meta-analysis to evaluate whether there is less blood loss and lower rates of transfusion after topical tranexamic acid administration in primary TKA.
Background Recently, a number of studies using intra-articular application of tranexamic acid (IA-TXA), with different dosages and techniques, successfully reduced postoperative blood loss in total knee replacement (TKR). However, to the best of our knowledge, a very low dose of IA-TXA with the drain-clamping technique in conventional TKR has not yet been studied. This study aimed to evaluate the effectiveness and dose-response effect of two low-dose IA-TXA regimens in conventional TKR on blood loss and blood transfusion reduction. Methods Between 2010 and 2011, a triple-blinded randomized controlled study was conducted in 135 patients undergoing conventional TKR. The patients were allocated into three groups according to the intra-articular solution received: Control group (physiologic saline), TXA-250 group (TXA 250 mg), and TXA-500 group (TXA 500 mg). The solution was injected after wound closure, followed by drain clamping for 2 hours. Blood loss and transfusion were recorded. Duplex ultrasound was performed. Functional outcome and complications were followed for one year. Results There were forty-five patients per group. The mean total hemoglobin loss was 2.9 g/dL in the control group compared with 2.2 g/dL in both TXA groups (p > 0.001). Ten patients (22%, control), six patients (13%, TXA-250) and none (TXA-500) required transfusion (p = 0.005). Thromboembolic events were detected in 7 patients (4 controls, 1 TXA-250, and 2 TXA-500). Functional outcome was not significantly different between groups. Conclusions Combined low-dose IA-TXA, as 500 mg, with a 2-hour clamped drain is effective for reducing postoperative blood loss and transfusion in conventional TKR without significant differences in postoperative knee function or complications.
Trial registration ClinicalTrials.gov NCT01850394 Abstract Purpose To compare the blood loss and the blood transfusion between a control group and a group of patients following either a local administration of tranexamic acid or a mechanical post-operative knee flexion, a controlled randomized study was performed. Methods Sixty patients affected by primary knee osteoarthritis and candidates to receive a primary unilateral total knee arthroplasty were enrolled in a prospective, randomized, controlled study. Exclusion criteria were the following: tranexamic acid allergy, the use of pharmacological anticoagulant therapy, previous knee surgery and renal failure. For each patient, the following parameters were investigated: the blood loss volume, the haemoglobin and haematocrit concentrations and the blood transfusion needs. Results Compared to the control group, the administration of systemic tranexamic acid significantly reduces (p < 0.05) both the blood loss (average reduction 39.8%) and the blood transfusion needs (64%). Furthermore, the tranexamic acid group shows a significant reduction (p < 0.05) compared to the knee flexion group in the blood loss (average reduction 31.8%) and the transfusion needs (65%). However, even if the knee flexion technique slightly reduces the blood loss (average reduction 11.6%) compared to the control group, this difference is not statistically significant (n.s.). Moreover, this treatment did not reduce the transfusion needs compared to the control group (n.s.). The incidence of complications was not influenced by any of the treatments. Conclusions The use of tranexamic acid compared to knee flexion and to the control group significantly reduces blood loss and transfusion needs, without wound complications or symptomatic deep vein thrombosis.
Level of evidence Prospective therapeutic study, Level The aim of the present study was to investigate aspects of coagulation and fibrinolysis during knee arthroplasties in order to find out: 1. whether an increased fibrinolysis is correlated to an increased blood loss; 2. whether there is a difference in markers for coagulation and fibrinolysis in peripheral venous blood compared to those in blood from the wounds; 3. whether the administration of tranexamic acid modifies the fibrinolytic response. Twenty-four patients were included. Twelve patients were given tranexamic acid intravenously at the end of the operation. The dose was repeated three hours later. The other 12 patients were given an equivalent amount of placebo. The administration was randomised and double-blind. Levels of prothrombin fragments 1+2, D-dimers, plasminogen, alpha 2-antiplasmin, tissue plasminogen activator (tPA), and plasminogen activator inhibitor (PAI-1) in venous blood were investigated just before the operation, at the end of the operation and three hours later. At the end of the operation blood for analysis was also drawn from the wound. Coagulation and fibrinolysis were activated during and after surgery. The activation was significantly higher in blood from the wounds than in peripheral venous blood. We found no direct correlation between the degree of fibrinolysis and blood loss. The administration of tranexamic acid reduced fibrinolysis in the wounds but not in peripheral venous blood. The postoperative blood loss was reduced by half The ideal method of providing tranexamic acid (TXA) for decreasing hemoglobin drop after TKA is still controversial. In this clinical trial, 200 patients were randomly allocated to four groups. In group 1, 500 mg TXA was administered intravenously. In group 2, the joint was irrigated with 3 g of TXA in 100 cc of saline. In group 3, 1.5 g of TXA was injected through the drain. Group 4 did not receive TXA.
Although all methods had a statistically significant effect on hemoglobin drop, drainage and the number of transfused units when compared to controls, intravenous injection of TXA seems to be much more effective in terms of reducing hemoglobin drop and transfused units; what's more, TXA injection via the drain is more effective in reducing postoperative drainage BACKGROUND Total knee arthroplasty (TKA) is often carried out using a tourniquet and shed blood is collected in drains. Tranexamic acid decreases the external blood loss. Some blood loss may be concealed, and the overall effect of tranexamic acid on the haemoglobin (Hb) balance is not known. METHODS Patients with osteoarthrosis had unilateral cemented TKA using spinal anaesthesia. In a double-blind fashion, they received either placebo (n=24) or tranexamic acid 10 mg kg(-1) (n=27) i.v. just before tourniquet release and 3 h later. The decrease in circulating Hb on the fifth day after surgery, after correction for Hb transfused, was used to calculate the loss of Hb in grams. This value was then expressed as ml of blood loss. RESULTS The groups had similar characteristics. The median volume of drainage fluid after placebo was 845 (interquartile range 523-990) ml and after tranexamic acid was 385 (331-586) ml (P<0.001). Placebo patients received 2 (0-2) units and tranexamic acid patients 0 (0-0) units of packed red cells (P<0.001). The estimated blood loss was 1426 (1135-1977) ml and 1045 (792-1292) ml, respectively (P<0.001). The hidden loss of blood (calculated as loss minus drainage volume) was 618 (330-1347) ml and 524 (330-9620) ml, respectively (P=0.41). Two patients in each group developed deep vein thrombosis. CONCLUSIONS Tranexamic acid decreased total blood loss by nearly 30%, drainage volume by approximately 50% and drastically reduced transfusion.
However, concealed loss was only marginally influenced by tranexamic acid and was at least as large as the drainage volume Purpose The surgical stress of the total knee arthroplasty (TKA) procedure and the application of an intra-operative pneumatic thigh tourniquet increase local fibrinolytic activity, which contributes significantly to post-operative blood loss. Tranexamic acid, an antifibrinolytic drug, is commonly used to control post-operative blood loss. The recommended mode of administration of tranexamic acid is either oral or intravenous. However, the mechanism of action of tranexamic acid points towards the possible effectiveness it may have following local/intra-articular application. This prospective, double-blinded, randomized preliminary study evaluated the efficacy of intra-articular tranexamic acid in reducing TKA-associated post-operative blood loss. Methods Fifty consenting patients with osteoarthritis of the knee scheduled for primary unilateral cemented TKA were randomly allocated to one of two groups: the Tranexamic Acid (TA) group (n = 25, 500 mg/5 ml tranexamic acid) and the control group (n = 25, 5 ml 0.9% saline). The drug and control solution were administered intra-articularly through the drain tube immediately after wound closure. Parameters related to blood loss (drop in haemoglobin, haematocrit differential) and the drain output [volume (ml)] were compared between the two groups. Results On a comparative basis, the TA group obtained a significant reduction in drain output [95% CI: 360.41-539.59, p < 0.001] at 48 h post-operatively. Even though the control group received sixfold more blood transfusion than the TA group, it showed a greater drop in haemoglobin and haematocrit (p < 0.05). Conclusions Local application of tranexamic acid seems to be effective in reducing post-TKA blood loss as well as blood transfusion requirements.
Level of evidence Therapeutic study, Level II Background The antifibrinolytic tranexamic acid reduces surgical blood loss, but studies have not identified an optimal regimen. Questions/purposes We studied different dosages, timings, and modes of administration to identify the most effective regimen of tranexamic acid in achieving maximum reduction of blood loss in TKA. Methods We prospectively studied five regimens (four intravenous, one local; 40 patients each) with a control group (no tranexamic acid). The four intravenous (10-mg/kg dose) regimens included (1) an intraoperative dose (IO) given before tourniquet deflation, (2) an additional preoperative dose (POIO), (3) an additional postoperative dose (IOPO), and (4) all three doses (POIOPO). The fifth regimen was a single local application (LA). Two independent parameters of drain loss and total blood loss, calculated by the hemoglobin balance method, were evaluated statistically. Results Both parameters were reduced in all five regimens as against the control. A significant reduction in drain loss was seen in the POIO, IOPO, and POIOPO groups, whereas total blood loss was significantly reduced in the POIO, POIOPO, and LA groups. The POIOPO group had the least drain loss (303 mL) and least total blood loss (688 mL). The IO group had the greatest drain loss and the IOPO group the greatest total blood loss. Conclusions Single-dose tranexamic acid did not give effective results. The two-dose regimen of POIO was the least amount necessary for effective results. When compared against the control, this regimen produced reduction of drain loss and total blood loss, whereas the IOPO regimen did not. The three-dose regimen of POIOPO produced maximum effective reduction of drain loss and total blood loss. Level of Evidence Level I, therapeutic study.
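Several of the abstracts above estimate total blood loss by a haemoglobin-balance method: the peri-operative Hb decline, corrected for transfused Hb, is converted into grams of Hb lost and then into a blood-loss volume. A minimal sketch of that arithmetic follows; the use of Nadler's blood-volume formula and the illustrative patient values are assumptions, as the abstracts do not state which blood-volume estimate was used:

```python
# Hedged sketch of a haemoglobin-balance blood-loss estimate, as described
# in prose above: grams of Hb lost = blood volume x Hb decline + Hb transfused,
# then expressed as millilitres of blood at the preoperative Hb concentration.
# Nadler's blood-volume formula is an assumption; the studies may have used another.

def nadler_blood_volume_l(sex: str, height_m: float, weight_kg: float) -> float:
    """Estimated total blood volume in litres (Nadler's formula)."""
    if sex == "M":
        return 0.3669 * height_m ** 3 + 0.03219 * weight_kg + 0.6041
    return 0.3561 * height_m ** 3 + 0.03308 * weight_kg + 0.1833

def total_blood_loss_ml(sex, height_m, weight_kg,
                        hb_pre_g_l, hb_post_g_l, hb_transfused_g=0.0):
    """Total blood loss in mL from the peri-operative Hb balance."""
    bv_l = nadler_blood_volume_l(sex, height_m, weight_kg)
    hb_loss_g = bv_l * (hb_pre_g_l - hb_post_g_l) + hb_transfused_g
    # convert grams of Hb lost to mL of whole blood at the preoperative Hb level
    return hb_loss_g / hb_pre_g_l * 1000.0

# illustrative patient: male, 1.75 m, 80 kg, Hb 140 -> 110 g/L, no transfusion
loss = total_blood_loss_ml("M", 1.75, 80.0, 140.0, 110.0)
print(round(loss))
```

For this hypothetical patient the estimate comes out on the order of a litre, in the same range as the totals reported in the abstracts.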
See Guidelines for Authors for a complete description of levels of evidence A high-dose local tranexamic acid has been introduced in total knee arthroplasty for bleeding control. We are not sure about the systemic absorption and side effects. The aim of this study was to evaluate the effect of a low dosage of intra-articular tranexamic acid injection combined with a 2-hour clamped drain in minimally bleeding computer-assisted surgery total knee replacement (CAS-TKR). A prospective randomized controlled trial was conducted in a total of 48 patients who underwent CAS-TKR. The patients were randomly assigned to receive either a mixed intra-articular solution of tranexamic acid 250 mg with physiologic saline (TXA group) or physiologic saline (control group), followed by drain clamping for 2 hours. Postoperative blood loss was measured by three different methods: drainage volume, total hemoglobin loss and calculated total blood loss. Transfusion requirements and postoperative complications were recorded. All patients were screened for deep vein thrombosis and the functional outcomes were evaluated at 6 months after surgery. The mean postoperative drainage volume, total hemoglobin loss and calculated total blood loss in the TXA group were 308.8 mL, 2.1 g/dL and 206.3 mL compared to 529.0 mL, 3.0 g/dL and 385.1 mL in the control group (P=0.0003, 0.0005 and < 0.0001 respectively). Allogenic blood transfusion was needed for one patient (4.2%) in the TXA group and for eight patients (33.3%) in the control group. Postoperative knee scores were not significantly different between groups. No deep vein thrombosis, infection or wound complication was detected in either group.
In this study, low-dose intra-articular tranexamic acid injection combined with a 2-hour clamped drain was effective for reducing postoperative blood loss and transfusion requirements in CAS-TKR without significant differences in postoperative complications or functional outcomes Three hundred and thirty orthopaedic surgeons in the United States participated in a study of transfusion requirements associated with total joint arthroplasty. A total of 9482 patients (3920 patients who had a total hip replacement and 5562 patients who had a total knee replacement) were evaluated prospectively from September 1996 through June 1997. Of those patients, 4409 (46 percent [57 percent of the patients who had a hip replacement and 39 percent of the patients who had a knee replacement]) had a blood transfusion. Two thousand eight hundred and ninety patients (66 percent) received autologous blood, and 1519 patients (34 percent) received allogenic blood. Ordered logistic regression analysis showed the most important predictors of the transfusion of allogenic blood to be a low baseline hemoglobin level and a lack of predonated autologous blood. Preoperative donation of autologous blood decreases the risk of transfusion of allogenic blood; however, inefficiencies in the procedures for obtaining autologous blood were identified. Sixty-one percent (5741) of the patients had predonated blood for autologous transfusion, but 4464 (45 percent) of the 9920 units of the predonated autologous blood were not used. Primary procedures and revision total knee arthroplasty were associated with the greatest number of wasted autologous units. Of the 5741 patients who had predonated blood, 503 (9 percent) needed a transfusion of allogenic blood.
The frequency of allogenic blood transfusion varied with respect to the type of operative procedure (revision total hip arthroplasty and bilateral total knee arthroplasty were associated with the highest prevalence of such transfusions) and with a baseline hemoglobin level of 130 grams per liter or less. Transfusion of allogenic blood was also associated with infection (p < or = 0.001), fluid overload (p < or = 0.001), and increased duration of hospitalization (p < or = 0.01). These latter findings warrant further evaluation in controlled studies Computer-assisted surgery (CAS) in total knee arthroplasty (TKA) could be useful in reducing the overall blood loss. A prospective randomised study was performed with two groups of 50 patients each, who were treated for knee arthritis. Patients of group A were treated by a conventional standard procedure, while for patients of group B a specific CAS procedure was used. We determined the intraoperative blood loss according to the Orthopaedic Surgery Transfusion Haemoglobin European Overview (OSTHEO) study. The average blood loss in patients of group A was 1,974 ml (range: 450-3,930 ml) compared to 1,677 ml in patients of group B (range: 500-2,634 ml). A statistically significant difference was found between the two groups (p = 0.0283). Computer-assisted surgery is highly recommended in TKR to save blood. It creates more possibilities to operate on anaemic patients and subjects who cannot accept blood products by reducing blood loss risk. Purpose This is a randomised controlled trial to examine whether intra-articular injection of tranexamic acid (TXA) decreases blood loss, as well as reducing leg swelling, after total knee arthroplasty (TKA). Methods We performed 100 TKAs in osteoarthritis patients. At closure, a total of 2,000 mg/20 ml TXA was injected into the knee joint through a closed suction drain (TXA group). For the control group, the same volume of physiological saline was injected. The pre-operative condition of the patients, post-operative haemoglobin (Hb) levels, discharge volumes from the drain, D-dimer and need for transfusion were compared between these two groups. Furthermore, leg diameters (thigh, suprapatellar portion and calf girth) were measured pre- and post-operatively to investigate whether TXA has an influence on leg swelling after surgery. Results The results revealed that the post-operative decrease in Hb level was significantly reduced in the TXA group. Furthermore, knee joint swelling after the operation was significantly suppressed in the TXA group compared to the control group.
Conclusions The results revealed that intra-articular administration of TXA decreased not only blood loss, but also knee joint swelling after TKA Abstract The objective of this randomized controlled trial was to evaluate the efficacy and safety of intra-articular injections of tranexamic acid (TXA) on perioperative blood loss and transfusion in primary unilateral total knee arthroplasty (TKA) without drainage. Primary TKA was performed on a total of 80 patients (80 knees) affected to various degrees by knee osteoarthritis. The patients were randomized to receive 500 mg of TXA in 20 mL of normal saline solution (n = 40) or an equivalent volume of normal saline solution (n = 40), applied into the joint for 5 min at the end of surgery. Data on routine blood examination, blood loss and blood transfusion after TKA were compared between the two groups. The results showed no significant difference between the two groups in intra-operative blood loss (P = 0.136). The mean postoperative visible blood loss, hidden blood loss and transfusion requests were significantly different between the two groups (P < 0.05). The values of postoperative hemoglobin and hematocrit were lower in the control group compared with those in the treatment group (P < 0.05). No deep vein thrombosis was detected through Doppler ultrasound examination. Three-hour postoperative D-dimer in the control group was higher than in the treatment group (P = 0.02). There was no statistically significant difference between the coagulation indicators and range of motion in the two groups. We conclude that intra-articular TXA in patients undergoing unilateral TKA could significantly reduce postoperative blood loss and blood transfusion and avoid perioperative anemia-related complications without an increased risk of venous thrombosis. Level of evidence Level I.
Therapeutic study BACKGROUND In total knee arthroplasty surgery, a blood conservation program is applied as normal clinical practice to avoid allogenic transfusions. The objective of this study was to assess the effectiveness of tranexamic acid in reducing transfusions in total knee replacement even when a blood conservation program is applied. STUDY DESIGN AND METHODS In a double-blind prospective study, the patients scheduled for total knee arthroplasty were included in a well-established blood conservation program and then randomly assigned into two groups: in the tranexamic acid group, a 10 mg per kg i.v. bolus followed by a 1 mg per kg per hour perfusion was administered, while in the control group, saline was given matching the protocol. RESULTS Ninety-five patients were included (tranexamic acid group, 46; control group, 49). Thirty-three patients (34.7%) underwent preoperative procedures to reduce transfusions: presurgical autologous blood donation (12), recombinant erythropoietin (6), and elementary iron (15); a postoperative drain for reinfusion was allocated in all cases. Total blood loss on the fourth postoperative day was [mean (±SD)] 1744 (±804) mL in controls compared with 1301 (±621) mL in the tranexamic acid group (p < 0.05). Eleven units of blood were transfused (6 patients) in the control group versus one in the tranexamic acid group (p < 0.05). Only 2 patients (4%) in the tranexamic acid group received reinfusion of blood recovered by drains compared with 36 (73%) in the control group (p < 0.0001). No thromboembolic complications were detected. CONCLUSION Tranexamic acid reduces blood losses and transfusion requirements even when a blood conservation program is used, and it questions the usefulness of postoperative reinfusion drains Major blood loss is a known potential complication in total hip and total knee arthroplasty.
We conducted a prospective, stratified, randomized, double-blind, placebo-controlled trial that evaluated 100 patients undergoing total knee or total hip arthroplasty to evaluate the effect on blood loss of the topical application of tranexamic acid. Participants received either 2 g of topical tranexamic acid or the equivalent volume of placebo into the joint prior to surgical closure. Tranexamic acid resulted in a lower mean maximum decline in postoperative hemoglobin levels when compared to placebo (P = 0.013). Patients in the tranexamic acid group demonstrated an improved but non-significant reduction in the units of blood transfused compared to placebo (P = 0.423). There was no clinically significant increase in complications in the tranexamic acid group, including no incidence of venous thromboembolism. OBJECTIVE: To evaluate the effect of locally applied tranexamic acid on postoperative blood loss and measures of fibrinolysis in drained blood. DESIGN: Prospective study. SETTING: University hospital, Norway. PATIENTS: 30 patients operated on for low back pain by screw fixation of the lumbar spine, 16 of whom were randomised to be given topical tranexamic acid. MAIN OUTCOME MEASURES: Postoperative blood loss after 18 hours. Concentrations of plasmin/alpha2-antiplasmin (PAP) and D-dimer in arterial and drained blood at the time of wound closure and in drained blood after 1 hour. RESULTS: In the tranexamic group, median (interquartile range) blood loss was reduced by half, from 525 (325-750) mL to 252 (127-465) mL, p = 0.02. In drained blood after one hour, the increase in the concentration of PAP was 150 (109-170)% and D-dimer 150 (107-272)% in the tranexamic group, compared with the control group where the increase in PAP was 320 (140-540)% and D-dimer 260 (161-670)%.
CONCLUSION: Tranexamic acid applied in the wound inhibits blood loss by up to a half in major orthopaedic surgery, probably because it prevents excessive fibrinolysis. Tranexamic acid (TNA) reduces postoperative blood loss in general and obstetrical surgery, but there is limited orthopaedic literature regarding its use in the topical setting. To study the effect of topical TNA after primary total knee arthroplasty (TKA), 101 patients were randomized to topical administration of 2.0 g TNA in 75 mL of normal saline (50 patients) or placebo (51 patients). Operative technique, drug administration, and venous thromboembolism prophylaxis were standardized. All patients underwent screening ultrasound of the operative extremity. Total blood loss was lower in the TNA group (940.2 ± 327.1 mL) than the placebo group (1293.1 ± 532.7 mL) (P < 0.001), and four patients in the placebo group and none in the TNA group received postoperative transfusion (P = 0.118). We recommend administration of topical TNA in primary TKA in healthy patients to decrease perioperative blood loss. The application of a pneumatic tourniquet in orthopedic procedures enhances local fibrinolysis. Consequently, short-term antifibrinolytic therapy may be indicated in this clinical situation to reduce postoperative blood loss. The purpose of this prospective double-blind study was to investigate the effect of tranexamic acid (TA) on blood loss associated with total knee arthroplasty (TKA). Seventy-five patients scheduled for 77 TKAs were randomized to receive either TA (n = 39) or an equal volume of normal saline (NS, n = 38). Before deflation of the tourniquet, 15 mg/kg of TA was given intravenously, followed by two 10-mg/kg additional doses. Perioperative blood loss gathered in surgical gauzes, suction reservoirs, and the postoperative drainage system was measured. The number of transfusions given during hospitalization was registered.
Total blood loss (mean ± SD) was 689 ± 289 mL in the TA group and 1509 ± 643 mL in the NS group (P < 0.0001). The mean number of transfused red cell units in the TA group was 1.0 ± 1.2 compared to 3.1 ± 1.6 in the NS group (P < 0.0001). Twenty-two patients in the TA group and four patients in the NS group were treated without transfusion (P < 0.00003). Two patients in the TA group and three in the NS group had a deep venous thrombosis, including a fatal case of pulmonary embolism in the NS group. We conclude that short-term TA therapy significantly reduces TKA-associated blood loss and transfusion requirements without increasing thromboembolic complications. (Anesth Analg 1997;84:839-44). BACKGROUND: Topical application of tranexamic acid to bleeding wound surfaces reduces blood loss in patients undergoing some major surgeries, without systemic complications. The objective of the present trial was to assess the efficacy and safety of the topical application of tranexamic acid on postoperative blood loss in patients undergoing primary unilateral total knee arthroplasty with cement. METHODS: In a prospective, double-blind, placebo-controlled trial, 124 patients were randomized to receive 1.5 or 3.0 g of tranexamic acid in 100 mL of normal saline solution or an equivalent volume of placebo (normal saline solution) applied into the joint for five minutes at the end of surgery. The primary outcome was blood loss calculated from the difference between the preoperative hemoglobin level and the corresponding lowest postoperative value or hemoglobin level prior to transfusion. The safety outcomes included Doppler ultrasound in all patients and measurement of plasma levels of tranexamic acid one hour after release of the tourniquet. RESULTS: Twenty-five patients were withdrawn for various reasons; therefore, ninety-nine patients were included in the intention-to-treat analysis.
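As a back-of-envelope check on summary figures like those above (689 ± 289 mL in 39 TA patients vs. 1509 ± 643 mL in 38 saline patients), Welch's t statistic can be recomputed directly from the reported means, SDs, and group sizes. This is an illustrative sketch only, not the trial's own analysis; the helper name `welch_t` is ours.

```python
import math

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's t statistic for two independent means with unequal variances."""
    se = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    return (mean1 - mean2) / se

# Reported summary data: NS group 1509 +/- 643 mL (n=38), TA group 689 +/- 289 mL (n=39)
t = welch_t(1509, 643, 38, 689, 289, 39)
print(round(t, 2))  # a t statistic this large is consistent with the reported P < 0.0001
```

A t statistic above 7 on roughly 50 effective degrees of freedom corresponds to a vanishingly small p-value, matching the direction of the abstract's result.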
The postoperative blood loss was reduced in the 1.5- and 3.0-g tranexamic acid groups (1295 mL [95% confidence interval, 1167 to 1422 mL] and 1208 mL [95% confidence interval, 1078 to 1339 mL], respectively) in comparison with the placebo group (1610 mL [95% confidence interval, 1480 to 1738 mL]) (p < 0.017). The postoperative hemoglobin levels were higher in the 1.5- and 3.0-g tranexamic acid groups (10.0 g/dL [95% confidence interval, 9.5 to 10.4 g/dL] and 10.1 g/dL [95% confidence interval, 9.8 to 10.5 g/dL], respectively) in comparison with the placebo group (8.6 g/dL [95% confidence interval, 8.2 to 9 g/dL]) (p < 0.017). With the numbers studied, there was no difference in the rates of deep-vein thrombosis or pulmonary embolism between the three groups. Minimal systemic absorption of tranexamic acid was observed. CONCLUSIONS: At the conclusion of a total knee arthroplasty with cement, topical application of tranexamic acid directly into the surgical wound reduced postoperative bleeding by 20% to 25%, or 300 to 400 mL, resulting in 16% to 17% higher postoperative hemoglobin levels compared with placebo, with no clinically important increase in complications being identified in the treatment groups. Background: Extensive blood loss in total knee replacement (TKR) surgery is well known and is associated with a high transfusion rate of allogenic blood. Tranexamic acid (TXA) has been shown to reduce blood loss by 50% in this patient group, but only in cases with a perioperative loss of 1400-1800 mL. This study was performed to see if TXA offers any advantages in knee replacement surgery with blood loss at 800 mL. The purpose of this study was to compare the efficacy of topical tranexamic acid (TXA) versus intravenous (IV) tranexamic acid for reduction of blood loss following primary total knee arthroplasty (TKA).
This prospective randomized study involved 89 patients comparing topical administration of 2.0 g TXA versus IV administration of 10 mg/kg. There were no differences between the two groups with regard to patient demographics or perioperative function. The primary outcome measure, perioperative change in hemoglobin level, showed a decrease of 3.06 ± 1.02 in the IV group and 3.42 ± 1.07 in the topical group (P = 0.108). There were no statistical differences between the groups in preoperative hemoglobin level, lowest postoperative hemoglobin level, or total drain output. One patient in the topical group required blood transfusion (P = 0.342). Based on our study, topical tranexamic acid has similar efficacy to IV tranexamic acid for TKA patients. BACKGROUND: Approximately one-third of patients undergoing total knee replacement require one to three units of blood postoperatively. Tranexamic acid (TXA) is a synthetic antifibrinolytic agent that has been successfully used intravenously to stop bleeding after total knee replacement. A topical application is easy to administer, provides a maximum concentration of tranexamic acid at the bleeding site, and is associated with little or no systemic absorption of the tranexamic acid. METHODS: A double-blind, randomized controlled trial of 157 patients undergoing unilateral primary cemented total knee replacement investigated the effect of topical (intra-articular) application of tranexamic acid on blood loss. The primary outcome was the blood transfusion rate. Secondary outcomes included the drain blood loss, hemoglobin concentration drop, generic quality of life (EuroQol), Oxford Knee Score, length of stay, a cost analysis, and complications as per the protocol definitions.
RESULTS: Tranexamic acid reduced the absolute risk of blood transfusion by 15.4% (95% confidence interval [CI], 7.5% to 25.4%; p = 0.001), from 16.7% to 1.3%, and reduced blood loss by 168 mL (95% CI, 80 to 256 mL; p = 0.0003), the length of stay by 1.2 days (95% CI, 0.05 to 2.43 days; p = 0.041), and the cost per episode by £333 (95% CI, £37 to £630; p = 0.028). (In 2008, £1 = 1.6 U.S. dollars.) Oxford Knee Scores and EuroQol EQ-5D scores were similar at three months. CONCLUSIONS: Topically applied tranexamic acid was effective in reducing the need for blood transfusion following total knee replacement without important additional adverse effects. LEVEL OF EVIDENCE: Therapeutic Level I. See Instructions for Authors for a complete description of levels of evidence.
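The transfusion rates reported above (16.7% with placebo vs. 1.3% with topical tranexamic acid) imply the stated absolute risk reduction of 15.4% and a number needed to treat of roughly 7. A minimal arithmetic sketch; the helper `arr_nnt` is illustrative and not from the source.

```python
def arr_nnt(risk_control, risk_treated):
    """Absolute risk reduction and number needed to treat from two event rates."""
    arr = risk_control - risk_treated
    return arr, 1.0 / arr

# Transfusion rates from the trial: placebo 16.7%, topical TXA 1.3%
arr, nnt = arr_nnt(0.167, 0.013)
print(f"ARR = {arr:.3f}, NNT = {nnt:.1f}")
```

In other words, about one transfusion is avoided for every seven patients treated, under these point estimates.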
11,048
25,084,252
There were no statistically significant differences in overall mortality, recurrent myocardial infarction, or graft patency between the two surgical techniques. However, patients who underwent endoscopic harvesting had significantly lower incidences of wound infection, hematoma formation, and paresthesia. Overall, the available evidence suggests that the endoscopic approach is associated with superior perioperative outcomes, without clear evidence of compromised patency or survival.
Objective: The radial artery has been demonstrated to provide superior long-term patency outcomes compared with saphenous veins for selected patients who undergo coronary artery bypass graft surgery. Recently, endoscopic radial artery harvesting has been popularized to improve cosmetic and perioperative outcomes. However, concerns have been raised regarding the effects of this relatively novel technique on long-term survival and graft patency. The present meta-analysis aimed to assess the safety and efficacy of endoscopic radial artery harvesting versus the conventional open approach.
BACKGROUND: For coronary surgery we often use the radial artery (RA) instead of the saphenous vein, trying to exploit the advantages offered by this conduit. To eliminate the problems regarding alteration of upper-extremity function after RA procurement related to the standard conventional harvesting technique, we started using the less invasive harvesting technique, with surprisingly good preliminary results. To compare the outcomes of open versus less invasive harvesting procedures, a prospective, nonrandomized study was developed by 2 centers. METHODS: From January 2001 to March 2003, there were 87 consecutive patients in the less invasive radial artery harvesting (LIRAH) group and 90 patients in the conventional radial artery harvesting (CRAH) group. Patient characteristics and demographics were similar in the groups. Data collection was performed to evaluate possible benefits of the LIRAH technique in terms of fewer forearm and hand complications, better aesthetics, and improved patient satisfaction. RESULTS: Between January 11, 2001, and March 30, 2003, 177 patients underwent either primary or redo coronary artery revascularization with procurement of the RA for use as a conduit with the less invasive harvesting technique. The mean follow-up was 2 months. Four patients died, and overall mortality was 2.26%. One hundred seventy-three patients were successfully examined during the first postoperative control, 85 in the LIRAH group and 88 in the CRAH group. Objective and subjective data were collected by the consultant. The overall average age was 60.5 years (range, 40-77 years). In the LIRAH group, the mean overall incision length (when 2 incisions were necessary, both incision lengths were measured) was 5.6 cm (range, 4-10 cm), and the mean vessel length was 16 cm (range, 10-19 cm). Eighteen patients (20.6%) necessitated a double incision.
Mean harvesting time (from incision to skin closure) was 43.3 min (range, 25-70 min). Fourteen patients (16.4%) presented some kind of complication during the study. There were no cases of acute ischemia, bleeding, or re-exploration. Seventy-five patients (88.2%) found the cosmetic result excellent. Ten patients (11.8%) found it good, and none considered it mediocre. In the CRAH group, the mean incision length was 20 cm (range, 18-22 cm), and the mean vessel length was 18 cm (range, 17-20 cm). Mean harvesting time (from incision to skin closure) was 30.8 min (range, 14-45 min). Thirty-four patients (38.6%) presented some kind of complication during the study. Three patients (3.5%) found the cosmetic result excellent. Forty-three (48.8%) found it good, and 42 (47.7%) considered it mediocre. CONCLUSIONS: A potential for fewer neurological forearm postoperative complications, better aesthetics, and improved patient satisfaction can be achieved with the LIRAH technique. OBJECTIVE: To compare the endothelial integrity of radial artery grafts harvested by minimally invasive surgery and arteries harvested conventionally for coronary artery bypass surgery (CABG) in 200 participants, who were assigned to interventions by random allocation. METHODS: An immunohistochemical procedure with monoclonal antibodies was employed to estimate CD31 antigen and endothelial nitric oxide synthase (eNOS) expression, markers defining endothelial integrity. RESULTS: The CD31 immunostaining revealed that the endothelial cell integrity of the minimally invasive harvested arteries was preserved in 76.1 ± 7.4% of the circumference of the luminal endothelium, which was similar to the results obtained in conventionally harvested grafts (77.2 ± 9.8%; not significant).
On the other hand, eNOS immunostaining indicated that the endothelial integrity of the minimally invasive harvested grafts was preserved in 75.4 ± 10.5%, while in conventionally harvested grafts it was reduced to 42.4 ± 14.5% of the total luminal endothelium circumference (P < 0.05). CONCLUSIONS: The endothelial integrity of radial artery grafts harvested by minimally invasive surgery is better preserved than in grafts obtained in the conventional manner. This could play an important role in improving graft patency and might represent a preliminary condition for stable functioning in coronary arterial bypasses. OBJECTIVES: The purpose of this study was to present radial and saphenous vein graft (SVG) occlusion results more than 5 years following coronary artery bypass surgery. BACKGROUND: In the RAPS (Radial Artery Patency Study), complete graft occlusion was less frequent in radial artery grafts compared with SVGs 1 year postoperatively, while functional occlusion (Thrombolysis In Myocardial Infarction flow grade 0, 1, 2) was similar. METHODS: A total of 510 patients < 80 years of age undergoing primary isolated nonemergent coronary artery bypass grafting with 3-vessel disease were initially enrolled in 9 Canadian centers. Target vessels for the radial artery and study SVG were the right and circumflex coronary arteries, which had > 70% proximal stenosis. Within-patient randomization was performed; the radial artery was randomized to either the right or circumflex territory, and the study SVG was used for the other territory. The primary endpoint was functional graft occlusion by invasive angiography at least 5 years following surgery. Complete graft occlusion by invasive angiography or computed tomography angiography was a secondary endpoint. RESULTS: A total of 269 patients underwent late angiography (234 invasive angiography, 35 computed tomography angiography) at a mean of 7.7 ± 1.5 years after surgery.
The frequency of functional graft occlusion was lower in radial arteries compared with SVGs (28 of 234 [12.0%] vs. 46 of 234 [19.7%]; p = 0.03 by McNemar's test). The frequency of complete graft occlusion was also significantly lower in radial arteries compared with SVGs (24 of 269 [8.9%] vs. 50 of 269 [18.6%]; p = 0.002). CONCLUSIONS: Radial arteries are associated with reduced rates of functional and complete graft occlusion compared with SVGs more than 5 years following surgery. (Multicentre Radial Artery Patency Study: 5 Year Results; NCT00187356). BACKGROUND: Endoscopic radial artery harvest provides a better cosmetic result without compromising the quality of the graft. We sought to compare postoperative harvesting-site neurologic and vascular outcomes. METHODS: From 10/2002 until 10/2004, 50 patients were randomized to have their radial artery harvested for coronary bypass either endoscopically (group A, n = 25) or conventionally (group B, n = 25). Radial arteries were preoperatively evaluated by Doppler echocardiography. Neurologic and functional status was assessed by a self-reporting questionnaire with a semiquantitative (1-5) scale. Vascular status of the forearm was assessed by control echocardiography. RESULTS: At an average follow-up of 37 ± 7 months, patients undergoing endoscopic radial artery harvesting had fewer overall neurologic complications (11 versus 17 patients, P = .023), and they were less severe (0.8 ± 1.1 versus 2.2 ± 1.2; P < .001). The ulnar flow increase was similar between the groups: 13.1 ± 5.43 cm/s in group A versus 15.9 ± 4.9 cm/s in group B (P = .147), as was the ulnar artery diameter increase: 0.29 ± 0.16 mm in group A versus 0.29 ± 0.26 mm in group B (P = .914). CONCLUSION: Endoscopic radial artery harvesting is safe and does not compromise graft quality or forearm and hand circulation postoperatively.
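The within-patient design above (both graft types in each patient) calls for McNemar's test, whose statistic depends on the discordant pairs: patients in whom only one of the two grafts occluded. The abstract reports only the marginal totals (24/269 radial vs. 50/269 SVG completely occluded), so the discordant split below (b = 5, c = 31, chosen only to be consistent with the difference of 26) is purely hypothetical, used to illustrate the calculation.

```python
def mcnemar_chi2(b, c):
    """McNemar chi-square with continuity correction for paired binary outcomes.

    b, c: counts of discordant pairs (exactly one member of each pair positive).
    The statistic is compared against a chi-square with 1 degree of freedom.
    """
    return (abs(b - c) - 1) ** 2 / (b + c)

# Hypothetical discordant counts consistent with the reported marginals
# (radial occluded only: b = 5; SVG occluded only: c = 31; b - c = 24 - 50).
chi2 = mcnemar_chi2(5, 31)
print(round(chi2, 2))  # values above 3.84 are significant at the 5% level (1 df)
```

The concordant pairs (both grafts occluded, or neither) drop out of the statistic entirely, which is why the marginal totals alone do not determine the p-value.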
Along with providing a better cosmetic result, endoscopic artery harvesting reduces postoperative harvesting-site pain and neurologic complications. BACKGROUND: We report the 5-year results of the SYNTAX trial, which compared coronary artery bypass graft surgery (CABG) with percutaneous coronary intervention (PCI) for the treatment of patients with left main coronary disease or three-vessel disease, to confirm findings at 1 and 3 years. METHODS: The randomised, clinical SYNTAX trial with nested registries took place in 85 centres in the USA and Europe. A cardiac surgeon and interventional cardiologist at each centre assessed consecutive patients with de-novo three-vessel disease or left main coronary disease to determine suitability for the study treatments. Eligible patients suitable for either treatment were randomly assigned (1:1) by an interactive voice response system to either PCI with a first-generation paclitaxel-eluting stent or to CABG. Patients suitable for only one treatment option were entered into either the PCI-only or CABG-only registries. We analysed a composite rate of major adverse cardiac and cerebrovascular events (MACCE) at 5-year follow-up by Kaplan-Meier analysis on an intention-to-treat basis. This study is registered with ClinicalTrials.gov, number NCT00114972. FINDINGS: 1800 patients were randomly assigned to CABG (n=897) or PCI (n=903). More patients who were assigned to CABG withdrew consent than did those assigned to PCI (50 vs 11). After 5 years' follow-up, Kaplan-Meier estimates of MACCE were 26.9% in the CABG group and 37.3% in the PCI group (p<0.0001). Estimates of myocardial infarction (3.8% in the CABG group vs 9.7% in the PCI group; p<0.0001) and repeat revascularisation (13.7% vs 25.9%; p<0.0001) were significantly increased with PCI versus CABG.
All-cause death (11.4% in the CABG group vs 13.9% in the PCI group; p=0.10) and stroke (3.7% vs 2.4%; p=0.09) were not significantly different between groups. 28.6% of patients in the CABG group with low SYNTAX scores had MACCE versus 32.1% of patients in the PCI group (p=0.43), and 31.0% in the CABG group with left main coronary disease had MACCE versus 36.9% in the PCI group (p=0.12); however, in patients with intermediate or high SYNTAX scores, MACCE was significantly increased with PCI (intermediate score, 25.8% of the CABG group vs 36.0% of the PCI group, p=0.008; high score, 26.8% vs 44.0%, p<0.0001). INTERPRETATION: CABG should remain the standard of care for patients with complex lesions (high or intermediate SYNTAX scores). For patients with less complex disease (low SYNTAX scores) or left main coronary disease (low or intermediate SYNTAX scores), PCI is an acceptable alternative. All patients with complex multivessel coronary artery disease should be reviewed and discussed by both a cardiac surgeon and an interventional cardiologist to reach consensus on optimum treatment. FUNDING: Boston Scientific. We present a less traumatic surgical technique for harvesting the radial artery as a coronary artery bypass graft that does not require any special equipment or skills. We prospectively randomized 40 patients undergoing coronary artery bypass grafting with the radial artery into two groups on the basis of harvest technique: tunneling excision and the conventional open method. The less-invasive tunneling technique is safe, easily applicable, and preferred by patients because of the superior cosmetic result. BACKGROUND: In the past decade, the radial artery has frequently been used for coronary bypass surgery despite concern regarding the possibility of graft spasm. Graft patency is a key predictor of long-term survival.
We therefore sought to determine the relative patency rates of radial-artery and saphenous-vein grafts in a randomized trial in which we controlled for bias in the selection of patients and vessels. METHODS: We enrolled 561 patients at 13 centers. The left internal thoracic artery was used to bypass the anterior circulation. The radial-artery graft was randomly assigned to bypass the major vessel in either the inferior (right coronary) territory or the lateral (circumflex) territory, with the saphenous-vein graft used for the opposing territory (control). The primary end point was graft occlusion, determined by angiography 8 to 12 months postoperatively. RESULTS: Angiography was performed at one year in 440 patients: 8.2 percent of radial-artery grafts and 13.6 percent of saphenous-vein grafts were completely occluded (P=0.009). Diffuse narrowing of the graft (the angiographic "string sign") was present in 7.0 percent of radial-artery grafts and only 0.9 percent of saphenous-vein grafts (P=0.001). The absence of severe native-vessel stenosis was associated with an increased risk of occlusion of the radial-artery graft and diffuse narrowing of the graft. Harvesting of the radial artery was well tolerated. CONCLUSIONS: Radial-artery grafts are associated with a lower rate of graft occlusion at one year than are saphenous-vein grafts. Because the patency of radial-artery grafts depends on the severity of native-vessel stenosis, such grafts should preferentially be used for target vessels with high-grade lesions. BACKGROUND: Vein-graft harvesting with the use of endoscopy (endoscopic harvesting) is a technique that is widely used to reduce postoperative wound complications after coronary-artery bypass grafting (CABG), but the long-term effects on the rate of vein-graft failure and on clinical outcomes are unknown.
METHODS: We studied the outcomes in patients who underwent endoscopic harvesting (1753 patients) as compared with those who underwent graft harvesting under direct vision, termed open harvesting (1247 patients), in a secondary analysis of 3000 patients undergoing CABG. The method of graft harvesting was determined by the surgeon. Vein-graft failure was defined as stenosis of at least 75% of the diameter of the graft on angiography 12 to 18 months after surgery (data were available in an angiographic subgroup of 1817 patients and 4290 grafts). Clinical outcomes included death, myocardial infarction, and repeat revascularization. Generalized estimating equations were used to adjust for baseline covariates associated with vein-graft failure and to account for the potential correlation between grafts within a patient. Cox proportional-hazards modeling was used to assess long-term clinical outcomes. RESULTS: The baseline characteristics were similar between patients who underwent endoscopic harvesting and those who underwent open harvesting. Patients who underwent endoscopic harvesting had higher rates of vein-graft failure at 12 to 18 months than patients who underwent open harvesting (46.7% vs. 38.0%, P<0.001). At 3 years, endoscopic harvesting was also associated with higher rates of death, myocardial infarction, or repeat revascularization (20.2% vs. 17.4%; adjusted hazard ratio, 1.22; 95% confidence interval [CI], 1.01 to 1.47; P=0.04), death or myocardial infarction (9.3% vs. 7.6%; adjusted hazard ratio, 1.38; 95% CI, 1.07 to 1.77; P=0.01), and death (7.4% vs. 5.8%; adjusted hazard ratio, 1.52; 95% CI, 1.13 to 2.04; P=0.005). CONCLUSIONS: Endoscopic vein-graft harvesting is independently associated with vein-graft failure and adverse clinical outcomes.
Randomized clinical trials are needed to further evaluate the safety and effectiveness of this harvesting technique. BACKGROUND: Radial arteries are being used more often for coronary artery bypass grafting. A minimally invasive technique was devised for harvesting vessels and compared with the traditional harvesting technique. METHODS: In a prospective study of 200 consecutive patients undergoing coronary artery bypass grafting, 100 patients had traditional open radial artery harvesting and 100 underwent endoscopic radial artery harvesting. All patients had a preoperative modified Allen's test with Doppler imaging. The traditional technique involved a longitudinal incision over the radial aspect of the arm from the wrist to the antecubital fossa. The radial artery was dissected subfascially and removed. The endoscopic technique involved a 3-cm incision over the radial aspect of the arm. A vessel loop was placed around the artery and carbon dioxide was insufflated into the wound. The radial artery was dissected to the brachial artery and ligated with an Endo-loop ligature. The branches were divided with bipolar electrocautery and ligated with clips. Patients were evaluated for postoperative pain, bleeding, neuralgias, infection, and any adverse events. A p value of less than 0.05 was considered significant. RESULTS: All 200 radial arteries were successfully harvested and used as grafts. Patients who had undergone endoscopic radial artery harvesting had significantly fewer major complications than patients who underwent the open technique: hematomas (five versus no complications) or wound infections requiring antibiotics (seven versus one complication). The occurrence of major neuralgias that restricted function was also significantly lower postoperatively and at 1, 3, and 6 months (ten versus one, eight versus one, five versus zero, and one versus zero, respectively).
CONCLUSIONS: Endoscopic radial artery harvesting results in good cosmetic results, usable grafts, and minimal neuralgias. Endoscopic radial artery harvesting is better than traditional open radial artery harvesting. In 1975, 80 patients undergoing revascularization were prospectively randomized to receive either a greater saphenous vein (SV) graft (41 patients, Group 1) or a left internal mammary artery (LIMA) graft (39 patients, Group 2) to the left anterior descending coronary artery (LAD). All patients were completely revascularized. The average number of grafts per patient in both groups was 3.2. Patients were followed for 10 years; follow-up was 97.5% complete. Group 1 and Group 2 were compared with regard to mortality, treadmill response, myocardial infarction, reoperation, percutaneous transluminal coronary angioplasty, and return to work. Mortality in Group 1 was 17.9% versus 7.7% in Group 2 (p less than 0.05). Treadmill studies were positive in 17 Group 1 patients and 7 Group 2 patients (p less than 0.05). Myocardial infarctions occurred in 8 patients in Group 1 versus 3 in Group 2. The number of reoperations was 2 in Group 1 versus 1 in Group 2. Percutaneous transluminal coronary angioplasty was performed in 3 patients in Group 1 and 2 in Group 2. Repeat studies revealed 76.3% patency of the SV graft to the LAD (Group 1) and 94.6% patency of the LIMA graft to the LAD (Group 2). Cardiac-related mortality in Group 1 was 12.8% at 10 years (5 patients) versus 7.7% in Group 2 (3 patients). Based on this study, the IMA is a superior conduit for bypass to the LAD. The aim of the study was to compare three different methods of radial artery harvesting with regard to postoperative complications and perioperative stress to the patient. A total of 60 patients admitted for coronary artery bypass surgery were randomized into three groups.
Each patient underwent extraction of the radial artery, all performed by a single surgeon. The radial artery was harvested by one of the following three techniques: the classical technique (20 patients), the mini-invasive technique (20), or the endoscopic technique (20). The time required for graft harvest was greater in the group where the endoscopic technique was used (52.6 ± 11.3 min) than with the mini-invasive (41.5 ± 7.3 min) or the classical (27.8 ± 4.6 min) technique. Postoperative blood loss into drains was higher where the classical technique was used (35.5 ± 9.4 mL) as compared to the mini-invasive (20 ± 5 mL) or the endoscopic (10 ± 7.3 mL) technique. There was no significant difference among the groups in the rate of local neurological complications, contusion of the wound edge, edema of the extremity, or wound infection rate. We observed no case of ischemia of the extremity, and a single case of postoperative myocardial ischemia in the group where the classical technique was used. From a clinical point of view, the mini-invasive and the endoscopic approach are comparable, but the latter is more expensive. Both mini-invasive and endoscopic techniques prolong the operation, reduce perioperative blood loss, and require additional training time. BACKGROUND: We sought to assess our initial experience with the recently introduced technique of endoscopic radial artery harvest (ERH) for coronary artery bypass grafting (CABG). METHODS: Data were prospectively collected on 108 consecutive patients undergoing isolated CABG with ERH and compared with 120 patients having conventional harvest (CH). Follow-up was achieved in 227 patients (99%). At the time of follow-up, the severity of motor and sensory symptoms, as well as the cosmetic result in the harvest forearm, was subjectively graded using a 5-point scale: Grade 1, high-intensity deficits, poor cosmetic result; Grade 5, no deficits, excellent cosmetic result.
RESULTS Hospital mortality, myocardial infarction, and stroke rates were similar between the groups. Follow-up mortality, reintervention rate, and average angina class were also similar. Harvest time was longer in the ERH group (61 ± 24 min vs. 45 ± 11 min, p < 0.001). Three patients in the ERH group were converted to CH and one radial artery was discarded. There were no vascular complications of the hand in either group. Average scores for motor (ERH 4.4 ± 0.9, CH 4.2 ± 1.0) and sensory symptoms (ERH 3.7 ± 1.1, CH 3.8 ± 1.2) were similar. In the CH group sensory deficits were observed in the distribution of both the lateral antebrachial cutaneous and the superficial radial nerves (SRN). In contrast, sensory deficits in the ERH group were limited to the distribution of the SRN. Cosmetic result score was higher in the ERH group (ERH 4.2 ± 1.0, CH 3.1 ± 1.4, p < 0.0001). CONCLUSIONS ERH is safe. It is technically demanding with a significant learning curve. Motor and sensory symptoms are not completely eliminated by using a smaller incision, but cosmetic results are clearly superior. BACKGROUND The radial artery (RA) is a commonly used arterial conduit in coronary artery bypass grafting (CABG). Traditional open-vessel harvest often leads to postoperative wound complications and cosmetic problems. Endoscopic RA harvesting (ERAH) has been widely used to prevent these problems. The purpose of this study was to assess these problems and graft patency in the first 50 patients who underwent ERAH. METHODS Between February 2006 and October 2007, 50 patients underwent ERAH with the VasoView system (Boston Scientific). These patients were compared with 50 patients who underwent the traditional open technique. RESULTS The mean age was 62.8 years in both groups. All RAs were successfully harvested. No conversion was made from ERAH to the traditional open technique.
The mean harvesting time (forearm ischemic time) was 27.4 ± 6.5 minutes, and the mean length of the RA in the ERAH group was 18.5 cm. Neither wound complications, such as wound infection and skin necrosis, nor severe neurologic complications were recorded. The patency rate was 95.9% (95/99) in the ERAH group and 94% (94/100) in the open group. CONCLUSION ERAH can be performed safely, and the early results are satisfactory. Endoscopic vessel harvesting is therefore recommended as the technique of choice for RA harvesting. Harvest of the saphenous vein is a commonly performed procedure in cardiovascular surgery. The incision required for its removal is the longest used anywhere. In this report, the authors describe a minimally invasive technique for removal of the vein. This has been used in 30 patients undergoing peripheral arterial bypass (n = 27), venovenous bypass (n = 2), and a saphenopopliteal fistula (n = 1). There were three perioperative complications: skin necrosis over the tunnel (one), bulla (one), and saphenous vein injury (one). Harvest time averaged 1.25 h. There was minimal postoperative discomfort at the harvest site and minimal scarring. Endoscopic harvest of the saphenous vein differs from most laparoscopic procedures because of its linear course. Consequently, visualization and dissection are coaxial rather than triangulated. This study demonstrates the technical feasibility of vein harvest. Development of appropriate instrumentation for opening the optical cavity and vein manipulation will reduce operative times.
11,049
25,833,333
Colonic J pouch was associated with lower stool frequency and antidiarrhoeal medication use for up to 1 year after surgery compared with straight CAA. Transverse coloplasty and side-to-end CAA had similar functional outcomes to the colonic J pouch. No superiority was found for any of the techniques in terms of anastomotic leak rate. CONCLUSION Colonic J pouch and side-to-end CAA or transverse coloplasty lead to a better functional outcome than straight CAA for the first year after surgery.
BACKGROUND Options for reconstruction after low anterior resection (LAR) for rectal cancer include straight or side-to-end coloanal anastomosis (CAA), colonic J pouch and transverse coloplasty. This systematic review compared these techniques in terms of function, surgical outcomes and quality of life.
Background Functional outcome after rectal excision with coloanal anastomosis is improved by construction of a colonic J pouch. Present prospective randomized studies lack follow-up beyond 1 year. The aim of this study was to assess the clinical outcome at both short- and long-term follow-up. Objective To assess the efficacy of a novel coloplasty colonic pouch design in optimizing bowel function after ultralow anterior resection. Summary Background Data A colonic J-pouch may reduce excessive stool frequency and incontinence after anterior resection, but at the risk of evacuation problems. Experimental surgery on pigs has suggested that a coloplasty pouch (CP) may be a useful alternative. Although CP has recently been shown to be feasible in patients, there is no randomized controlled trial comparing bowel function with the J-pouch. Methods After anterior resection for cancer, patients were allocated to either J-pouch or CP-anal anastomoses. Continence scoring, anorectal manometry, and endoanal ultrasound assessments were made before surgery. All complications were recorded, and these preoperative assessments were repeated at 4 months. The assessments were repeated again at 1 year, and a quality of life questionnaire was added. Results Eighty-eight patients were recruited from October 1998 to April 2000. Both groups were well matched for age, gender, staging, adjuvant therapy, and mean follow-up. There were no differences in intraoperative time and hospital stay. CP resulted in more anastomotic leaks. At 4 months, J-pouch patients had 10.3% less stool fragmentation but poorer stool deferment and more nocturnal leakage. However, there were no differences in bowel function, continence score, and quality of life at 1 year. There were no differences in the anorectal manometry and endoanal ultrasound findings. Conclusions Coloplasty pouches resulted in more anastomotic leaks and minimal differences in bowel function.
At present, the J-pouch remains the benchmark for routine clinical practice, and due care (including defunctioning stoma) should be exercised in situations requiring CP. Objectives To compare a colonic J-pouch or a side-to-end anastomosis after low anterior resection for rectal cancer with regard to functional and surgical outcome. Summary Background Data A complication after restorative rectal surgery with a straight anastomosis is low anterior resection syndrome, with postoperatively deteriorated anorectal function. The colonic J-reservoir is sometimes used with the purpose of reducing these symptoms. An alternative method is to use a simple side-to-end anastomosis. Methods One hundred patients with rectal cancer undergoing total mesorectal excision and coloanal anastomosis were randomized to receive either a colonic pouch or a side-to-end anastomosis using the descending colon. Surgical results and complications were recorded. Patients were followed with a functional evaluation at 6 and 12 months postoperatively. Results Fifty patients were randomized to each group. Patient characteristics in both groups were very similar regarding age, gender, tumor level, and Dukes' stages. A large proportion of the patients received short-term preoperative radiotherapy (78%). There was no significant difference in surgical outcome between the 2 techniques with respect to anastomotic height (4 cm), perioperative blood loss (500 ml), hospital stay (11 days), postoperative complications, reoperations or pelvic sepsis rates. Comparing functional results in the 2 study groups, only the ability to evacuate the bowel in < 15 minutes at 6 months reached a significant difference, in favor of the pouch procedure.
Conclusions The data from this study show that either a colonic J-pouch or a side-to-end anastomosis performed on the descending colon in low anterior resection with total mesorectal excision are methods that can be used with similar expected functional and surgical results. The efficacy of colon-J-pouch anal anastomosis (CPAA) in reducing defecatory frequency and urgency and the incidence of anastomotic fistulas has been proved by several studies, but only as compared to straight colo-anal anastomosis (CAA) of the end-to-end type. We investigated the role played by the colon pouch in the strict sense, without the influence of a different CAA model, in a randomised prospective study comparing CPAA and straight side-to-end CAA. Over the period from 1994 to 1998 we selected 66 of 118 patients operated on for rectal cancer: a CPAA was constructed in 35 (group P) and a direct side-to-end CAA in 31 (group D). The two groups were well matched for surgeon, type of patient, stage of disease and incidence of radiotherapy, and presented no differences in operative mortality, general and anastomotic morbidity, or need for reoperation. Functional results: after 3, 12 and 36 months, defecatory frequency ≥ 4 movements/day was observed in 93.4, 67.7 and 41.6% of cases, respectively, in group D as against 25.7, 14.2 and 13%, respectively, in group P (P < 0.05), while defecatory urgency was recorded in 77.4, 35.4 and 27.9% of cases, respectively, in group D as against 34.2, 17.1 and 9%, respectively, in group P (p < 0.05). In the long term, incontinence was also significantly lower in group P. The colon pouch improves sphincter rehabilitation after anal recanalization compared to straight side-to-end CAA. It does not affect anastomotic morbidity but affords a protective effect on function in irradiated patients.
CPAA proves to be the optimal reconstruction option after excision of the rectum. Flaws in the design, conduct, analysis, and reporting of randomised trials can cause the effect of an intervention to be underestimated or overestimated. The Cochrane Collaboration's tool for assessing risk of bias aims to make the process clearer and more accurate. Introduction: Colonic pouches have been used for 20 years to provide reservoir function after reconstructive proctectomy for rectal cancer. More recently coloplasty has been advocated as an alternative to a colonic pouch. However there have been no long-term randomized, controlled trials to compare functional outcomes of coloplasty, colonic J-pouch (JP), or a straight anastomosis (SA) after the treatment of low rectal cancer. Aim: To compare the complications, long-term functional outcome, and quality of life (QOL) of patients undergoing a coloplasty, JP, or an SA in reconstruction of the lower gastrointestinal tract after proctectomy for low rectal cancer. Methods: A multicenter study enrolled patients with low rectal cancer, who were randomized intraoperatively to coloplasty (CP-1) or SA if JP was not feasible, or JP or coloplasty (CP-2) if a JP was feasible. Patients were followed for 24 months with SF-36 surveys to evaluate QOL. Bowel function was measured quantitatively and using the Fecal Incontinence Severity Index (FISI). Urinary function and sexual function were also assessed. Results: Three hundred sixty-four patients were randomized. All patients were evaluated for complications and recurrence. Mean age was 60 ± 12 years, 71% were male. Twenty-three (7.4%) died within 24 months of surgery. No significant difference was observed in the complications among the 4 groups. Two hundred ninety-seven of 364 were evaluated for functional outcome at 24 months. There was no difference in bowel function between the CP-1 and SA groups.
JP patients had fewer bowel movements, less clustering, used fewer pads, and had a lower FISI than the CP-2 group. Other parameters were not statistically different. QOL scores at 24 months were similar for each of the 4 groups. Conclusions: In patients undergoing a restorative resection for low rectal cancer, a colonic JP offers significant advantages in function over an SA or a coloplasty. In patients who cannot have a pouch, coloplasty seems not to improve bowel function over that with an SA. Objectives The introduction of the colonic J-pouch has markedly improved the functional outcome of restorative rectal cancer surgery. However colonic J-pouch surgery can be problematic and may present some late evacuatory problems. To overcome these limitations a novel pouch has been proposed: the transverse coloplasty pouch. The purpose of our study was to compare the functional outcomes of these two different types of pouches – the transverse coloplasty pouch (TCP) and the colonic J-pouch (CJP) – during the first 12 months postoperatively. OBJECTIVE The authors compared clinical bowel function and complications of a low anterior resection with either a straight or colonic J pouch anastomosis. SUMMARY BACKGROUND DATA Urgency and frequent bowel movements after rectal resection with a low anastomosis have been related to the loss of rectal reservoir function. Reconstruction with a colonic J pouch can possibly obviate some of this dysfunction. Earlier reports have been favorable, but they must be verified in randomized trials. METHOD One hundred patients with rectal cancer in whom a sphincter-saving procedure was appropriate were randomized to reconstruction with either a straight or a colonic J pouch anastomosis. RESULTS The incidence of symptomatic anastomotic leakage was lower in the pouch group (2% vs. 15%, p = 0.03). Eighty-nine patients could be evaluated after 1 year.
The pouch patients had significantly fewer bowel movements per 24 hours, and less nocturnal evacuation, urgency, and incontinence. Overall well-being owing to bowel function was rated significantly higher by the pouch patients. CONCLUSION Reconstruction with a colonic J pouch was associated with a lower incidence of anastomotic leakage and better clinical bowel function when compared with the traditional straight anastomosis. Functional superiority was especially evident during the first 2 months. Twenty patients (13 men) with low rectal cancer, median (range) age 64.5 (38–83) years, were prospectively randomized to undergo ultra-low anterior resection with a J colonic pouch-anal anastomosis (median (range) distance of anastomosis from the anal verge 3 (1–4) cm). Another 20 patients (15 men), median (range) age 62.5 (44–86) years, with low rectal cancer were randomized to a straight coloanal anastomosis (median (range) distance of anastomosis from the anal verge 3.25 (2–5) cm). There were no significant differences in operative time or complications between the two groups. There was significantly better postoperative anal function in patients who underwent pouch-anal anastomosis at 1, 6 and 12 months after ileostomy closure. At 12 months all patients (19 of 19) with a pouch reconstruction had regained normal continence, compared with 14 of 20 of those who had a straight coloanal anastomosis. No patient complained of severe constipation requiring enema or intubation to evacuate. Abstract BACKGROUND: Low anterior resection with coloanal anastomosis prevents a definitive stoma in patients with distal rectal cancer. However, imperative stool urge, stool fragmentation, prolonged stooling sessions, and minor problems of incontinence are frequently observed in the postoperative situation and negatively affect quality of life. Therefore, the colonic J-pouch was originally constructed to create a stool reservoir.
In a randomized, prospective study, the short (5 cm) colonic J-pouch was tested for function and continence vs. straight coloanal anastomosis. METHODS: Over a period of 30 months, 74 consecutive patients (55 males) with rectal cancer in the lower and middle third of the rectum were included and randomized into two groups. Anastomosis was performed either as a coloanal or a colon-pouch-anal anastomosis. The standardized surgical procedure included mobilization of the left hemicolon, central ligation of the inferior mesenteric artery and vein, preaortal lymph node dissection, autonomic nerve preservation, and total mesorectal excision. The anastomosis was performed at the upper anal canal or at the intersphincteric level. All patients were evaluated preoperatively and six months postoperatively for fecal continence, including sphincter manometry and defecation habits. In addition, quality of life was determined by use of a standardized questionnaire (European Organization for Research and Treatment of Cancer, EORTC-QLQ-C30). RESULTS: Thirty-seven patients were randomized into each group. In general, problems with continence for liquids or gas occurred less frequently in the colonic J-pouch group 6 months after surgery. The frequency of bowel movements was lower in the J-pouch group (2.5 per day) than in the coloanal group (4.7 per day). Importantly, in a manometric study at the same postoperative point, neorectal capacity was decreased to a similar degree in both groups compared with the preoperative rectal volume. Thus, the expected and postulated reservoir effect could not be achieved by forming a 5-cm colonic J-pouch. CONCLUSION: The colonic J-pouch was superior with regard to continence for gas and liquids compared with a straight coloanal anastomosis. Furthermore, stool frequency was significantly lower in the J-pouch group than in the coloanal reconstruction group.
However, because neorectal capacity decreased equally in both groups, we speculate that the advantage of the colonic J-pouch is not the creation of a larger neorectal reservoir but rather may be related to decreased motility. BACKGROUND The colonic pouch is considered an alternative to the standard straight low anastomosis after resection for rectal cancer. The aim of this prospective randomized trial was to compare short- and long-term functional results of colonic J-pouch (CJP) and transverse coloplasty (TCP) after low anterior resection for rectal cancer. METHODS Between 2000 and 2005, patients with mid or low rectal cancer scheduled for an elective sphincter-preserving resection were eligible. The primary end point was to compare bowel functional results 6 months and 3 years after ileostomy closure. A fecal incontinence score and a questionnaire that included items for clinical evaluation of bowel function were used. RESULTS One hundred six patients were randomized; 54 patients were allocated to the CJP group and 52 to the TCP group. There were no differences between the 2 groups in terms of demographic and clinical data. Overall, the postoperative complication rate was 19.8%, without differences between the groups. Two patients (1.9%; one in each group) presented with anastomotic dehiscence. Long-term incomplete evacuation rates were 29.2% in the TCP group and 33.3% in the CJP group, without substantial differences. Overall, short- and long-term functional outcomes of both procedures were comparable. No differences were observed in terms of fecal incontinence or in any of the items included in the questionnaire. CONCLUSION TCP reconstruction after rectal cancer resection and coloanal anastomosis is functionally similar to CJP in both short- and long-term outcomes. The TCP technique does not seem to significantly improve the incomplete-defecation symptom with respect to CJP. REGISTRATION NUMBER NCT01396928; http://register.
clinicaltrials.gov. PURPOSE: Colonic pouches have gained increasing popularity in reconstruction after low anterior resection. In this prospective, randomized trial colonic pouch reconstruction is compared with side-to-end anastomosis for functional outcome. METHODS: From October 1995 to October 1996, 29 patients had colonic pouch and 30 patients had side-to-end anastomosis reconstruction after low anterior resection. Patients were matched for age, gender, and tumor stage and localization. All patients underwent functional evaluation preoperatively and at three and six months postoperatively. RESULTS: There was no difference in preoperative anorectal function. The operating time was higher in the colonic pouch group (167 vs. 149 minutes). Twenty-three patients (79.3 percent) with colonic pouch had a protective stoma compared with 21 patients (70 percent) with side-to-end anastomosis. Postoperative complications were 10.3 and 13.3 percent, respectively. There was no difference in manometric pressure of the anus, in anorectal angle, or in continence status after three and six months. Stool frequency was higher in the side-to-end anastomosis group, with 2.2 vs. 5.4 per day at three months and 2.3 vs. 3.1 per day at six months. Constipation was noted in two patients with colonic pouch (7 percent) and none in the side-to-end anastomosis group at three months, and two vs. none at six months. Maximum tolerated volume and threshold volume were higher in the colonic pouch group at three and at six months. CONCLUSION: Both forms of reconstruction have similar satisfactory long-term functional results. The major advantage of colonic pouch was seen in the immediate postoperative phase. Background To compare the functional and surgical outcomes of colonic J-pouch and straight anastomosis in the context that both reconstruction procedures were performed laparoscopically.
Methods The present study was a randomized prospective clinical trial. Patients with lower rectal cancer requiring laparoscopic total mesorectal excision were equally randomized to either laparoscopic-assisted colonic J-pouch reconstruction or laparoscopic straight end-to-end anastomosis. The techniques of the laparoscopic-assisted colonic J-pouch reconstruction are shown in the attached video. The primary end point was the comparison of functional results of both reconstruction methods. The secondary end points included safety (surgical morbidity and mortality), surgical efficiency, and postoperative recovery. Results A total of 48 patients were recruited within a 2-year period, in consideration of a statistical power of 90% for comparison. There was no marked difference between the patient groups undergoing colonic J-pouch surgery (n = 24) and straight anastomosis (n = 24) in various demographic and clinicopathologic parameters. The anorectal function of patients with colonic J-pouch was better than that with straight anastomosis 3 months after operation, as evaluated by stool frequency (mean ± standard deviation: 4.0 ± 2.0 vs. 7.0 ± 2.4 times/day, P < .001); use of antidiarrheal agents (29.2% [n = 7] vs. 75.0% [n = 18], P = .004); and perineal irritation (45.8% [n = 11] vs. 79.2% [n = 19], P = .037). Because of the relatively better bowel function in the immediate postoperative period, patients with colonic J-pouch reconstruction were less disabled after surgery and had a quicker return to partial activity (P = .039), full activity (P < .001), and work (P < .001). Both reconstruction methods were performed with similar amounts of blood loss, complication rates, and postoperative recovery. However, the operation time was significantly longer in the colonic J-pouch group (274.4 ± 34.0 vs. 202.0 ± 28.0 minutes, P < .001).
Conclusions Because laparoscopic-assisted creation of a colonic J-pouch achieved better short-term functional results of the anorectum and did not increase surgical morbidity, as compared with laparoscopic straight anastomosis, this reconstruction procedure could be recommended to patients with lower rectal cancer requiring laparoscopic total mesorectal excision. PURPOSE: Different studies have shown that low colorectal and coloanal anastomoses often yield poor functional results. The aim of the present study was to investigate whether a colonic reservoir is able to improve functional results. METHODS: Thirty-eight consecutive patients subjected to low anterior resection were randomized following rectal excision into two groups. One (n = 19) had a stapled straight coloanal anastomosis, and the other (n = 19) had a 10-cm stapled colonic pouch low rectal anastomosis. Median anastomotic distance above the anal verge was 3.38 ± 0.56 cm and 2.14 ± 0.36 cm in the two groups, respectively. Continence alterations, urgency, tenesmus, defecatory frequency, anal resting and maximum voluntary squeezing pressures, and maximum tolerable volume were evaluated one year later. RESULTS: One patient died of pulmonary embolism, and seven presented with a recurrence and were excluded from the study. Stool frequency was greater than three movements per day in 33.3 percent of cases with a reservoir and in 73.3 percent of those with a straight coloanal anastomosis (P < 0.05). Maximum tolerable volume was significantly greater in patients with a reservoir (335 ± 195) than in those without (148 ± 38) (P < 0.05). There were no significant differences in the other variables studied. CONCLUSIONS: This study shows that some aspects of defecatory function after rectal excision could improve with a colonic reservoir. PURPOSE Functional disturbances are common after anterior resection for rectal cancer.
This study was designed to compare functional and physiologic outcome after low anterior resection and total mesorectal excision with a colonic J-pouch or a side-to-end anastomosis. METHODS Functional and physiologic variables were analyzed in patients randomized to a J-pouch (n = 36) or side-to-end anastomosis (n = 35). Postoperative functional outcome was investigated with questionnaires. Anorectal manometry was performed preoperatively and at six months, one year, and two years postoperatively. RESULTS There was no statistical difference in functional outcome between groups at two years. Maximum neorectal volume increased in both groups but was approximately 40 percent greater at two years in pouches compared with the side-to-end anastomosis. Anal sphincter pressures were halved postoperatively and did not recover during the two years of follow-up. Male gender, low anastomotic level, pelvic sepsis, and the postoperative decrease of sphincter pressures were independent factors for more incontinence symptoms. CONCLUSIONS Colonic J-pouch and side-to-end anastomosis give comparable functional results two years after low anterior resection. Neorectal volume had no detectable influence on function. There was a pronounced and sustained postoperative decrease in sphincter pressures. Functional results after rectal resection with straight coloanal anastomosis are poor. While most functional aspects are improved with coloanal J-pouch anastomosis, it is still unclear whether this translates into better quality of life. The aim of this trial was to investigate health-related quality of life as a primary endpoint in patients undergoing sphincter-saving rectal resection. BACKGROUND The patients in this study formed part of a multicentre randomized trial comparing the straight and colonic pouch anastomoses after rectal excision for cancer. The trial reflected an advantage for the pouch group regarding frequency, urgency and incontinence.
We hypothesized that such differences in bowel function would be reflected in a general quality of life score. METHODS Forty-five patients were randomized. The Nottingham Health Profile was used to measure health-related quality of life before and 1 year after surgery. RESULTS There was no difference in quality of life score when the two groups were compared. When calculating the relative change after surgery, there was an improvement in both groups in the total score of the Profile (P < 0.0001). CONCLUSIONS The Nottingham Health Profile was effective in showing an improvement in quality of life in both groups after surgery. The observed difference in clinical bowel function was not, however, reflected in an improved quality of life score as measured by the Profile. AIM There is some evidence of functional superiority of the colonic J-pouch over straight coloanal anastomosis (CAA) in ultralow anterior resection (ULAR) or intersphincteric resection. On the assumption that colonic J-pouch anal anastomosis is superior to straight CAA in ULAR with upper sphincter excision (USE: excision of the upper part of the internal sphincter) for low-lying rectal cancer, we compared the functional outcome of the colonic J-pouch vs the straight CAA. METHODS Fifty of one hundred and thirty-three rectal cancer patients, in whom the lower margin of the tumor was located between 3 and 5 cm from the anal verge, received ULAR including USE from September 1998 to January 2002. Patients were randomized for reconstruction using either a straight (n = 26) or a colonic J-pouch anastomosis (n = 24) with a temporary diverting-loop ileostomy. All patients were followed up prospectively using a standardized questionnaire (Fecal Incontinence Severity Index (FISI) scores and Fecal Incontinence Quality of Life (FIQL) scales).
RESULTS We found that, compared to straight-anastomosis patients, the frequency of defecation was significantly lower in J-pouch anastomosis patients for 10 months after ileostomy takedown. The FISI scores and FIQL scales were significantly better in J-pouch patients than in straight-anastomosis patients at both 3 and 12 months after ileostomy takedown. Furthermore, we found that FISI scores correlated highly with FIQL scales. CONCLUSION This study indicates that colonic J-pouch anal anastomosis decreases the severity of fecal incontinence and improves quality of life for 10 months after ileostomy takedown in patients undergoing ULAR with USE for low-lying rectal cancer. PURPOSE The colonic J-pouch has been constructed to overcome reservoir dysfunction after restorative rectal surgery, whereas no effort has been made for sphincter dysfunction. We conducted a prospective, randomized study comparing surgical and functional outcomes between side-to-end anastomosis and colonic J-pouch after low anterior resection, in which the anastomosis was constructed from the abdomen. METHODS Fifty-six consecutive patients with middle-to-low rectal cancer undergoing low anterior resection were randomly assigned to the side-to-end or colonic J-pouch group preoperatively. Surgical outcomes of all the patients were recorded. Patients underwent functional evaluation, including anorectal manometry and functional assessment, preoperatively and then 3 months, 6 months, 1 year, and 2 years postoperatively. RESULTS Twenty-four patients in each group completed the study. The demographic data and preoperative functional assessment did not differ between the two groups. There was no significant difference in surgical outcomes with regard to anastomotic height (5 cm), blood loss, protective colostomy, operative time, complications, and adjuvant therapy.
Anal pressures showed no significant change postoperatively or during the follow-up period; there were no differences between the two groups. Temporary minor fecal incontinence was noted in the early postoperative period in both groups. With regard to bowel function, a significant reduction of the volume of urgency and maximal tolerable volume was found postoperatively in both groups; however, a faster recovery was noted in the colonic J-pouch group. Stool frequency increased significantly after surgery in both groups; however, in contrast to rectal volume, a faster recovery was noted in the side-to-end group. CONCLUSIONS Anastomosis after low anterior resection for middle-to-low rectal cancer could be performed safely from the abdomen. It minimized sphincter injury and showed good continence preservation. On the other hand, the surgical outcomes and long-term functional results of side-to-end anastomosis were comparable with the colonic J-pouch. Side-to-end anastomosis provides an easier, alternative way for reconstruction after restorative rectal surgery. The colonic J-pouch (pouch group) functions better than the straight coloanal anastomosis (straight group) immediately after ultra-low anterior resection, but there are few studies with long-term follow-up. This randomized controlled study compared functional outcome, anal manometry, and rectal barostat assessment of these two groups over a 2-year period. Forty-two consecutive patients were recruited, of whom 19 in the straight group [17 men with a mean age of 62.1 ± 2.3 (SEM) years] and 16 in the pouch group (11 men with a mean age of 61.3 ± 3.2 years) completed the study. Four died from metastases and two emigrated; there was no surgical morbidity or local recurrence. At 6 months the pouch patients had significantly less frequent stools (32.9 ± 2.8 vs. 49 ± 1.4/week; p < 0.05) and less soiling on passing flatus (38% vs. 73.7%; p < 0.05).
At 2 years both groups had improved with no longer any differences in stool frequency ( 7.3 ± 0.4 vs. 8 ± 0.2/week ) and soiling at passing flatus ( 38 % vs. 53 % ) . Defecation problems remained minimal in both groups . Anal squeeze pressures were significantly impaired in both groups up to 2 years ( p<0.05 ) . The rectal maximum tolerable volume and compliance were not different between groups . Rectal sensory testing on the barostat phasic program showed impairment at 6 months and recovery at 2 years , suggesting that postoperative recovery of residual afferent sympathetic nerves may play a role in functional recovery . In conclusion , stool frequency and incontinence were less in the Pouch patients at 6 months ; but after adaptation at 2 years the straight group patients yielded similar results . Nonetheless , this functional advantage can be given to patients with minimal added effort or complications by using the colonic J-pouch .
11,050
24,742,167
Discussion Several studies support the hypothesis that the influence of the childcare environment on children ’s physical activity and diet is moderated by child characteristics ( age , gender ) , but interaction between environmental types as well as between micro-systems is hardly examined in the field of behavioral nutrition and physical activity .
Background The ecological perspective holds that human behavior depends on the interaction of different environmental factors and personal characteristics , but it lacks validation and operationalization . In the current paper , an ecological view was adopted to examine the interactive impact of several ecological systems on children ’s dietary intake and physical activity at childcare or similar facilities .
Objective : To use an ecological systems approach to examine individual- , family- , community- and area-level risk factors for overweight ( including obesity ) in 3-year-old children . Methods : A prospective nationally representative cohort study conducted in England , Wales , Scotland , Northern Ireland . Participants included 13 188 singleton children aged 3 years in the Millennium Cohort Study , born between 2000 and 2002 , who had complete height/weight data . The main outcome measure was childhood overweight ( including obesity ) defined by the International Obesity TaskForce cut-offs for body mass index . Results : 23.0 % of 3-year-old children were overweight or obese . In the fully adjusted model , primarily individual- and family-level factors were associated with early childhood overweight : birthweight z-score ( adjusted odds ratio , 1.36 , 95 % CI 1.30 to 1.42 ) , black ethnicity ( 1.41 , 1.11 to 1.80 ) ( compared with white ) , introduction to solid foods < 4 months ( 1.12 , 1.02 to 1.23 ) , lone motherhood ( 1.32 , 1.15 to 1.51 ) , smoking during pregnancy ( 1–9 cigarettes daily : 1.34 , 1.17 to 1.54 ; 10–19 : 1.49 , 1.26 to 1.75 ; 20 + : 1.34 , 1.05 to 1.70 ) , parental overweight ( both : 1.89 , 1.63 to 2.19 ; father only : 1.45 , 1.28 to 1.63 ; mother only : 1.37 , 1.18 to 1.58 ) , prepregnancy overweight ( 1.28 , 1.14 to 1.45 ) and maternal employment ⩾21 hours/week ( 1.23 , 1.10 to 1.37 ) ( compared with never worked ) . Breastfeeding ⩾4 months ( 0.86 , 0.76 to 0.97 ) ( compared with none ) and Indian ethnicity ( 0.63 , 0.42 to 0.94 ) were associated with a decreased risk of early childhood overweight . Children from Wales were also more likely to be overweight than children from England . Conclusions : Most risk factors for early childhood overweight are modifiable or would allow at-risk groups to be identified . 
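The adjusted odds ratios and 95 % confidence intervals reported above (e.g. 1.36, 95 % CI 1.30 to 1.42 for birthweight z-score) come from exponentiating logistic-regression coefficients and their Wald interval limits. A minimal sketch of that arithmetic; the coefficient and standard error below are illustrative back-calculations, not values reported by the study:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Exponentiate a logistic-regression coefficient and its
    Wald interval to get an odds ratio with a 95% CI."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative numbers only: a coefficient of ln(1.36) with a
# standard error of ~0.0225 reproduces an OR of 1.36 (1.30 to 1.42).
or_, lo, hi = odds_ratio_ci(math.log(1.36), 0.0225)
```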
Policies and interventions should focus on parents and providing them with an environment to support healthy behaviours for themselves and their children OBJECTIVE To investigate the predictive association between preschool childcare arrangements and overweight/obesity in childhood . STUDY DESIGN Children were enrolled in a prospective birth cohort in Quebec , Canada ( n = 1649 ) . Information about childcare obtained via questionnaires to the mothers at ages 1.5 , 2.5 , 3.5 , and 4 years was used to compute a main childcare arrangement exposure variable ( center-based/family-based/care by a relative/nanny ) . Body mass index was derived from measured weights and heights at ages 4 , 6 , 7 , 8 , and 10 years and children were classified as overweight/obese versus normal weight . Generalized estimating equations were used to model the effect of main childcare arrangement ( center-based/family-based/relative/nanny ) ( vs parental care ) on overweight/obesity adjusting for several potential confounding factors . RESULTS Compared with parental care , children who attended a center-based childcare ( OR : 1.65 , 95 % CI : 1.13 - 2.41 ) or were cared for by a relative ( OR : 1.50 ; 95 % CI : 0.95 - 2.38 , although with greater uncertainty ) had higher odds of being overweight/obese in childhood ( 4 - 10 years ) . Analyses of number of hours additionally suggested that each increment of 5 hours spent in either center-based or relative childcare increased the odds of overweight/obesity in the first decade of life by 9 % . Associations were not explained by a wide range of confounding factors , including socioeconomic position , breastfeeding , maternal employment , and maternal body mass index . CONCLUSION Overweight/obesity was more frequently observed in children who received non-parental care in center-based settings or care by a relative other than the parent . 
" Obesogeonic " features of these childcare arrangements should be investigated in future studies Background : In young children , the eating environment is an important social context within which eating behaviors develop . Among many low-income young children , the responsibility for feeding may have shifted from family members to child care providers because these children spend the majority of their day in child care setting s. Methods : To examine the influence of feeding among low-income children in child care setting s , feeding behaviors of child care providers in Head Start were observed and food consumption was assessed . Head Start , a comprehensive child development program that serves children from ages 3 to 5 , was chosen because of the large percentage of minorities , the low-income status of the families , and the age of the children . Fifty child care providers ( 25 African-American ; 25 Hispanic ) r and omly selected from Head Start centers in a large , urban southwestern city were observed on three mealtime occasions and self-reported feeding styles were assessed . Observed feeding behaviors were categorized into four feeding patterns based on their conceptual similarity to a general parenting typology ( i.e. , authoritarian , authoritative , indulgent , and uninvolved ) . Measures of food consumption were assessed on 549 children sitting with the child care providers during lunch at the Head Start centers . Results : Indulgent feeding behaviors were positively related to children ’s consumption of vegetables , dairy , entrée , and starch ; authoritative feeding behaviors were positively related to dairy consumption . Conclusion : This research highlights the important influence that child care providers have in the development of healthy and unhealthy eating behaviors in minority children . 
Implications for intervention training for child care providers to promote healthy eating among Head Start children are discussed Background Low levels of physical activity are characteristic in preschoolers . To effectively promote physical activity , it is necessary to understand factors that influence young children 's physical activity . The present study aimed to investigate how physical activity levels are influenced by environmental factors during recess in preschool . Methods Preschool playground observations and pedometry during recess were carried out in 39 randomly selected preschools ( 415 boys and 368 girls ; 5.3 ± 0.4 years old ) . In order to examine the contribution of playground variables to physical activity levels , taking adjustment for clustering of subjects within preschools into account , multilevel analyses were conducted . Results During recess boys took significantly more steps per minute than girls ( 65 ± 36 versus 54 ± 28 steps/min ) . In both genders higher step counts per minute were significantly associated with less children per m2 and with shorter recess times . Only in boys a hard playground surface was a borderline significant predictor for higher physical activity levels . In girls higher step counts were associated with the presence of less supervising teachers . Playground markings , access to toys , the number of playing or aiming equipment pieces and the presence of vegetation or height differences were not significant physical activity predictors in both genders . Conclusion In preschool children physical activity during outdoor play is associated with modifiable playground factors . 
Further study is recommended to evaluate if the provision of more play space , the promotion of continued activity by supervisors and the modification of playground characteristics can increase physical activity levels in preschoolers BACKGROUND : The majority of infants in the United States are in nonparental child care , yet little is known about the effect of child care on development of obesity . OBJECTIVE : To examine the relationship between child care attendance from birth to 6 months and adiposity at 1 and 3 years of age . METHODS : We studied 1138 children from a prospective cohort of pregnant women and their offspring . The main exposure was time in child care from birth to 6 months of age , overall and by type of care : ( 1 ) child care center ; ( 2 ) someone else 's home ; and ( 3 ) child 's own home by nonparent . The main outcomes were weight-for-length ( WFL ) z score at 1 year and BMI z score at 3 years of age . RESULTS A total of 649 ( 57 % ) infants attended child care ; 17 % were cared for in a center , 27 % in someone else 's home , and 21 % in their own home by a nonparent . After adjustment for confounders , overall time in child care was associated with an increased WFL z score at 1 year and BMI z score at 3 years of age but not skinfold thicknesses . Center and own home care were not associated with the outcomes , but care in someone else 's home was associated with an increase in both the 1- and 3-year outcomes . CONCLUSION : Child care in the first 6 months of life , especially in someone else 's home , was associated with an increased WFL z score at 1 year and BMI z score at 3 years of age BACKGROUND Previous research has shown that children in child-care do not comply with dietary intake recommendations ( i.e. either exceeding or not meeting recommendations ) , which may be attributable to specific features of the child-care environment . 
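The weight-for-length and BMI z scores used as outcomes above are computed against external growth references, usually via the LMS method. A minimal sketch of that formula; the L/M/S values below are made-up placeholders, since real z scores use age- and sex-specific tables from WHO or CDC references:

```python
import math

def lms_zscore(x, L, M, S):
    """Anthropometric z score via the LMS method:
    z = ((x/M)**L - 1) / (L*S), or log(x/M)/S when L == 0."""
    if L == 0:
        return math.log(x / M) / S
    return ((x / M) ** L - 1) / (L * S)

# Placeholder reference values, not taken from any real growth chart:
z = lms_zscore(17.0, -0.5, 16.0, 0.08)
```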
The present study explored the relationship between the social and physical child-care ( day-care ) environment and dietary intake of 2- and 3-year-olds in Dutch child-care centres . METHODS The dietary intake of 135 children , aged 2 and 3 years , who were in child-care was assessed by observing randomly selected children at three meals ( morning snack , lunch and afternoon snack ) to determine dietary intake ( i.e. saturated fat , dietary fibre and energy intake ) . The environment was observed using the Environment and Policy Assessment and Observation checklist , a structured instrument assessing the physical and social environment . RESULTS Children consumed a mean of 486 kJ ( 116 kcal ) during the morning snack , 2043 kJ ( 488 kcal ) during lunch and 708 kJ ( 169 kcal ) during the afternoon snack . There were some gender and age differences in dietary intake . Several environmental factors ( e.g. serving style and staff 's model dietary behaviour ) were significantly associated with the children 's dietary intake . CONCLUSIONS Overall , energy intake was in the upper range of recommended intake for children in child-care . The associations of several environmental factors with dietary intake stress the importance of the child-care environment for children 's dietary behaviour . Intervening in this setting could possibly contribute to the comprehensive prevention of childhood obesity OBJECTIVE A randomized controlled pilot study to test the hypothesis that increasing preschool children 's outdoor free play time increases their daily physical activity levels . METHODS Physical activity was assessed by accelerometers for four consecutive school days in thirty-two Latino children ( 3.6+/-0.5 years ) attending a preschool for low-income families . After two days of baseline physical activity assessment , participants were randomly assigned to an intervention ( RECESS ; n = 17 ) or control ( CON ; n = 15 ) group . 
The RECESS group received two additional 30-minute periods of outdoor free play time per day for two days . The CON group followed their normal classroom schedule . Between group differences in physical activity variables were tested with a Wilcoxon rank-sum test . RESULTS There were no statistically significant differences between groups in changes from baseline in average total daily ( CON , 48.2+/-114.5 ; RECESS , 58.2+/-74.6 ) and during school day ( CON , 64.6+/-181.9 ; RECESS , 59.7+/-79.1 ) counts per minute , or total daily ( CON , 0.4+/-1.3 ; RECESS , 0.3+/-0.8 ) and during school day ( CON , 0.6+/-2.1 ; RECESS , 0.5+/-0.8 ) percent of time spent in moderate to vigorous physical activity . CONCLUSIONS Substantially increasing preschoolers ' outdoor free play time did not increase their physical activity levels OBJECTIVES We aimed to investigate the effects of providing play equipment and markings at the pre-school playground on physical activity engagement levels . METHODS We performed a cluster randomised controlled trial . In November and December 2007 , a convenience sample of 40 public pre-schools in Flanders , Belgium , was randomly assigned to one of the following conditions : 1 ) in 10 pre-schools play equipment was provided , 2 ) in 10 pre-schools markings were painted on the playground , 3 ) in 10 schools play equipment was provided and markings were painted , 4 ) 10 schools served as a control condition . Accelerometer-based physical activity levels during recess were evaluated at baseline and 4 to 6 weeks after the implementation of the intervention in 583 children ( 52 % boys ; mean age 5.3 years , SD 0.4 ) . RESULTS At baseline pre-schoolers spent only 11.2 % ( average : 4.7 min ) of recess time in moderate to vigorous activity , while 61.3 % ( average : 25.7 min ) was spent in sedentary activity . 
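The Wilcoxon rank-sum test used for the between-group comparisons above can be sketched with a normal approximation. This version uses midranks for ties but omits the tie correction to the variance, so exact implementations in statistical packages will differ slightly:

```python
import math

def rank_sum_test(a, b):
    """Wilcoxon rank-sum (Mann-Whitney) statistic for group `a`,
    with a two-sided p value from the normal approximation.
    Midranks handle ties; the tie correction is omitted."""
    pooled = sorted(a + b)
    midrank = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        midrank[pooled[i]] = (i + 1 + j) / 2   # average of ranks i+1..j
        i = j
    n1, n2 = len(a), len(b)
    w = sum(midrank[x] for x in a)             # rank sum of group a
    mu = n1 * (n1 + n2 + 1) / 2                # mean of w under H0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    p = math.erfc(abs(w - mu) / sigma / math.sqrt(2))  # 2*(1 - Phi(|z|))
    return w, p

# Toy data with complete separation between the two groups.
w, p = rank_sum_test([1, 2, 3], [4, 5, 6])    # w = 6
```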
The interventions were not effective in increasing the average activity levels or the percentages of engagement in moderate or vigorous activity , or in decreasing sedentary time . CONCLUSION Providing playground markings or play equipment is not sufficient to increase activity levels and decrease levels of sedentary activity during pre-school recess . More activating supervision and the inclusion of more structured physical activity seem needed Objective : We evaluated the effects of a preschool nutrition education and food service intervention “ Healthy Start , ” on two-to-five-year-old children in nine Head Start Centers in upstate NY . The primary objective was to reduce the saturated fat ( sat-fat ) content of preschool meals to < 10 % daily energy ( E ) and to reduce consumption of sat-fat by preschoolers to < 10 % E. Methods : Six centers were assigned to the food service intervention and three to control condition . Food service intervention included training workshops for cooks and monthly site visits to review progress towards goals . Child dietary intake at preschool was assessed by direct observation and plate waste measurement . Dietary intake at home was assessed by parental food record and telephone interviews . Dietary data were collected each Fall/Spring over two years , including five days of menus and recipes from each center . Dietary data were analyzed with the Minnesota NDS software . Results : Consumption of saturated fat from school meals decreased significantly from 11.0%E to 10.4%E after one year of intervention and to 8.0%E after the second year , compared with an increase of 10.2 % to 13.0 % to 11.4%E , respectively , for control schools ( p < 0.001 ) . Total caloric intake was adequately maintained for both groups . 
Analysis of preschool menus and recipes over the two-year period of intervention showed a significant decrease in sat-fat content in intervention preschools ( from 12.5 at baseline to 8.0%E compared with a change of 12.1%E to 11.6%E in control preschools ( p < 0.001 ) ) . Total fat content of menus also decreased significantly in intervention schools ( 31.0 % to 25.0%E ) compared with controls ( 29.9 % to 28.4%E ) . Conclusions : The Healthy Start food service intervention was effective in reducing the fat and saturated fat content of preschool meals and reducing children ’s consumption of saturated fat at preschool without compromising energy intake or intake of essential nutrients . These goals are consistent with current U.S. Dietary Guidelines for children older than two years of age
11,051
29,417,616
Within-subject comparisons restricted to AMED consumers revealed that alcohol consumption does not significantly differ between typical AMED and AO occasions . On past month heaviest drinking occasions , AMED users consume significantly less alcohol on AMED occasions when compared to AO occasions . AMED consumers experience significantly fewer negative consequences and risk-taking behavior on AMED occasions compared with AO occasions . Meta-analyses of subjective intoxication studies suggest that AMED consumption does not differentially affect subjective intoxication when compared to AO consumption . In conclusion , when compared to AO consumption , mixing alcohol with energy drink does not affect subjective intoxication and seems unlikely to increase total alcohol consumption , associated risk-taking behavior , or other negative alcohol-related consequences .
The purpose of this systematic review and meta-analysis was to critically review the ( 1 ) prevalence of alcohol mixed with energy drink ( AMED ) consumption , ( 2 ) motives for AMED consumption , ( 3 ) correlates of AMED consumption , and ( 4 ) whether AMED consumption has an impact on ( a ) alcohol consumption , ( b ) subjective intoxication , and ( c ) risk-taking behavior . Overall a minority of the population consumes AMED , typically infrequently . Motives for AMED consumption are predominantly hedonistic and social .
Background Energy drink consumption has continued to gain in popularity since the 1997 debut of Red Bull , the current leader in the energy drink market . Although energy drinks are targeted to young adult consumers , there has been little research regarding energy drink consumption patterns among college students in the United States . The purpose of this study was to determine energy drink consumption patterns among college students , prevalence and frequency of energy drink use for six situations , namely for insufficient sleep , to increase energy ( in general ) , while studying , driving long periods of time , drinking with alcohol while partying , and to treat a hangover , and prevalence of adverse side effects and energy drink use dose effects among college energy drink users . Methods Based on the responses from a 32 member college student focus group and a field test , a 19 item survey was used to assess energy drink consumption patterns of 496 randomly surveyed college students attending a state university in the Central Atlantic region of the United States . Results Fifty one percent of participants ( n = 253 ) reported consuming greater than one energy drink each month in an average month for the current semester ( defined as energy drink user ) . The majority of users consumed energy drinks for insufficient sleep ( 67 % ) , to increase energy ( 65 % ) , and to drink with alcohol while partying ( 54 % ) . The majority of users consumed one energy drink to treat most situations although using three or more was a common practice to drink with alcohol while partying ( 49 % ) . Weekly jolt and crash episodes were experienced by 29 % of users , 22 % reported ever having headaches , and 19 % heart palpitations from consuming energy drinks . There was a significant dose effect only for jolt and crash episodes . Conclusion Using energy drinks is a popular practice among college students for a variety of situations . 
Although for the majority of situations assessed , users consumed one energy drink with a reported frequency of 1 – 4 days per month , many users consumed three or more when combining with alcohol while partying . Further , side effects from consuming energy drinks are fairly common , and a significant dose effect was found with jolt and crash episodes . Future research should identify if college students recognize the amounts of caffeine that are present in the wide variety of caffeine-containing products that they are consuming , the amounts of caffeine that they are consuming in various situations , and the physical side effects associated with caffeine consumption Energy drinks containing caffeine , taurine , and glucose may improve mood and cognitive performance . However , there are no studies assessing the individual and interactive effects of these ingredients . We evaluated the effects of caffeine , taurine , and glucose alone and in combination on cognitive performance and mood in 24-hour caffeine-abstained habitual caffeine consumers . Using a randomized , double-blind , mixed design , 48 habitual caffeine consumers ( 18 male , 30 female ) who were 24-hour caffeine deprived received one of four treatments ( 200 mg caffeine/0 mg taurine , 0 mg caffeine/2000 mg taurine , 200 mg caffeine/2000 mg taurine , 0 mg caffeine/0 mg taurine ) , on each of four separate days , separated by a 3-day wash-out period . Between-participants treatment was a glucose drink ( 50 g glucose , placebo ) . Salivary cortisol , mood and heart rate were measured . An attention task was administered 30-minutes post-treatment , followed by a working memory and reaction time task 60-minutes post-treatment . Caffeine enhanced executive control and working memory , and reduced simple and choice reaction time . Taurine increased choice reaction time but reduced reaction time in the working memory tasks . Glucose alone slowed choice reaction time . 
Glucose in combination with caffeine , enhanced object working memory and in combination with taurine , enhanced orienting attention . Limited glucose effects may reflect low task difficulty relative to subjects ' cognitive ability . Caffeine reduced feelings of fatigue and increased tension and vigor . Taurine reversed the effects of caffeine on vigor and caffeine-withdrawal symptoms . No effects were found for salivary cortisol or heart rate . Caffeine , not taurine or glucose , is likely responsible for reported changes in cognitive performance following consumption of energy drinks , especially in caffeine-withdrawn habitual caffeine consumers BACKGROUND There has been a significant growth in the energy drink ( ED ) market in Australia and around the world ; however , most research investigating the popularity of ED and alcohol and energy drink ( AED ) use has focused on specific subpopulations such as university students . The aim of this study was to estimate the prevalence , consumption patterns , and sociodemographic correlates of ED and combined AED use among a representative Australian population sample . METHODS A computer-assisted telephone interview survey ( n = 2,000 ) was undertaken in March-April 2013 of persons aged 18 years and over . Half of the interviews were obtained through randomly generated landline telephone numbers and half through mobile phones . Approximately half of the sample was female ( 55.5 % ; n = 1,110 ) and the mean age of participants was 45.9 ( range 18 to 95 , SD 20.0 ) . RESULTS Less than 1 in 6 Australians reported ED use ( 13.4 % , n = 268 ) and 4.6 % ( n = 91 ) reported AED use in the past 3 months . Majority of ED and AED users consumed these beverages monthly or less . ED and AED users are more likely to be aged 18 to 24 years , live in a metropolitan area , and be moderate risk or problem gamblers . AED consumers are more likely to report moderate levels of psychological distress . 
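Survey prevalences such as the 13.4 % ED use figure above (268 of 2,000 respondents) are point estimates. A Wilson score interval, not reported by the study itself, sketches the sampling uncertainty around such a proportion:

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return centre - half, centre + half

# 268 energy drink users among 2,000 respondents (13.4 %).
lo, hi = wilson_ci(268, 2000)
```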
CONCLUSIONS Our findings in relation to problem gambling and psychological distress are novel and require further targeted investigation . Health promotion strategies directed toward reducing ED and AED use should focus on young people living in metropolitan areas and potentially be disseminated through locations where gambling takes place Rationale Combining alcohol and caffeine is associated with increased alcohol consumption , but no prospective experimental studies have examined whether added caffeine increases alcohol consumption . Objectives This study examined how caffeine alters alcohol self-administration and subjective reinforcing effects in healthy adults . Methods Thirty-one participants completed six double-blind alcohol self-administration sessions : three sessions with alcohol only ( e.g. , beverage A ) and three sessions with alcohol and caffeine ( e.g. , beverage B ) . Participants chose which beverage to consume on a subsequent session ( e.g. , beverage A or B ) . The effects of caffeine on overall beverage choice , number of self-administered drinks , subjective ratings ( e.g. , Biphasic Alcohol Effects Scale ) , and psychomotor performance were examined . Results A majority of participants ( 65 % ) chose to drink the alcohol beverage containing caffeine on their final self-administration session . Caffeine did not increase the number of self-administered drinks . Caffeine significantly increased stimulant effects , decreased sedative effects , and attenuated decreases in psychomotor performance attributable to alcohol . Relative to nonchoosers , caffeine choosers reported overall lower stimulant ratings and reported greater drinking behavior prior to the study . Conclusions Although caffeine did not increase the number of self-administered drinks , most participants chose the alcohol beverage containing caffeine . 
Given the differences in subjective ratings and pre-existing differences in self-reported alcohol consumption for caffeine choosers and nonchoosers , these data suggest that decreased stimulant effects of alcohol and heavier self-reported drinking may predict subsequent choice of combined caffeine and alcohol beverages . These predictors may identify individuals who would benefit from efforts to reduce risk behaviors associated with combining alcohol and caffeine BACKGROUND Well-known reports suggest that the use of energy drinks might reduce the intensity of the depressant effects of alcohol . However , there is little scientific evidence to support this hypothesis . OBJECTIVE AND METHODS The present study aimed at evaluating the effects of the simultaneous ingestion of an alcohol ( vodka(37.5%v/v ) ) and an energy drink ( Red Bull-3.57 mL/kg ) , compared with those presented after the ingestion of an alcohol or an energy drink alone . Twenty-six young healthy volunteers were randomly assigned to 2 groups that received 0.6 or 1.0 g/kg alcohol , respectively . They all completed 3 experimental sessions in random order , 7 days apart : alcohol alone , energy drink alone , or alcohol plus energy drink . We evaluated the volunteers ' breath alcohol concentration , subjective sensations of intoxication , objective effects on their motor coordination , and visual reaction time . RESULTS When compared with the ingestion of alcohol alone , the ingestion of alcohol plus energy drink significantly reduced subjects ' perception of headache , weakness , dry mouth , and impairment of motor coordination . However , the ingestion of the energy drink did not significantly reduce the deficits caused by alcohol on objective motor coordination and visual reaction time . The ingestion of the energy drink did not alter the breath alcohol concentration in either group . 
CONCLUSIONS Even though the subjective perceptions of some symptoms of alcohol intoxication were less intense after the combined ingestion of the alcohol plus energy drink, these effects were not detected in objective measures of motor coordination and visual reaction time, as well as on the breath alcohol concentration BACKGROUND Recently, Marczinski and colleagues (2013) showed that energy drinks combined with alcohol augment a person's desire to drink more alcohol relative to drinking alcohol alone. The current study replicates the findings of Marczinski and colleagues (2013) using a robust measure of alcohol craving. METHODS Seventy-five participants aged 18 to 30 years were assigned to an alcohol only or alcohol+energy drink condition in a double-blind randomized pre- versus posttest experiment. Participants received a cocktail containing either 60 ml of vodka and a Red Bull® Silver Edition energy drink (alcohol+energy drink condition) or 60 ml of vodka with a soda water vehicle (alcohol-only condition); both cocktails contained 200 ml of fruit drink. The primary outcome measure was the Alcohol Urge Questionnaire taken at pretest and at 20 minutes (posttest). Other measures taken at posttest were the Biphasic Alcohol Effects Questionnaire, the Drug Effects Questionnaire, and breath alcohol concentration (BAC). RESULTS The alcohol+energy drink condition showed a greater pre- versus posttest increase in urge to drink alcohol compared with the alcohol-only condition (B = 3.24, p = 0.021, d = 0.44). Participants in the alcohol+energy drink condition had significantly higher ratings on liking the cocktail and wanting to drink more of the cocktail, and lower BACs, than the alcohol-only condition. When examined at specific BACs, the effect of the energy drink on the pre- to posttest increase in urge to drink was largest and only significant at BACs of 0.04-0.05 (cf.
< 0.04 g/dl). There were no significant differences in stimulation, sedation, feeling the effects of the cocktail, or feeling high. CONCLUSIONS Combining energy drinks with alcohol increased the urge to drink alcohol relative to drinking alcohol alone. More research is needed to understand what factors mediate this effect and whether it increases subsequent alcohol consumption OBJECTIVES The consumption of alcohol mixed with energy drinks (AmED) is popular on college campuses in the United States. Limited research suggests that energy drink consumption lessens subjective intoxication in persons who also have consumed alcohol. This study examines the relationship between energy drink use, high-risk drinking behavior, and alcohol-related consequences. METHODS In Fall 2006, a Web-based survey was conducted in a stratified random sample of 4,271 college students from 10 universities in North Carolina. RESULTS A total of 697 students (24% of past 30-day drinkers) reported consuming AmED in the past 30 days. Students who were male, white, intramural athletes, fraternity or sorority members or pledges, and younger were significantly more likely to consume AmED. In multivariable analyses, consumption of AmED was associated with increased heavy episodic drinking (6.4 days vs. 3.4 days on average; p < 0.001) and twice as many episodes of weekly drunkenness (1.4 days/week vs. 0.73 days/week; p < 0.001). Students who reported consuming AmED had significantly higher prevalence of alcohol-related consequences, including being taken advantage of sexually, taking advantage of another sexually, riding with an intoxicated driver, being physically hurt or injured, and requiring medical treatment (p < 0.05). The effect of consuming AmED on driving while intoxicated depended on a student's reported typical alcohol consumption (interaction p = 0.027).
CONCLUSIONS Almost one-quarter of college student current drinkers reported mixing alcohol with energy drinks. These students are at increased risk for alcohol-related consequences, even after adjusting for the amount of alcohol consumed. Further research is necessary to understand this association and to develop targeted interventions to reduce risk OBJECTIVE This study examines the use of alcohol mixed with energy drinks (AmED) as a predictor of alcohol problems and alcohol-related consequences and accidents two years later in a college student sample. METHOD Longitudinal data on AmED use, alcohol consequences, and alcohol problems were collected from the fall of students' second year of college to the fall of their fourth year (N = 620, 49% male). RESULTS After we controlled for demographic indicators and heavy episodic drinking, AmED use was a consistent predictor of negative alcohol-related outcomes 2 years later. Compared with no AmED use, both infrequent (i.e., one to three times per month) and frequent (i.e., one or more times per week) AmED use were associated with an increased risk of negative alcohol consequences and harmful/hazardous alcohol use (≥8 on the Alcohol Use Disorders Identification Test [AUDIT]). Frequent AmED use was also associated with serious alcohol problems (≥16 on AUDIT) and an increased risk of alcohol-related accidents in the subsequent 2 years. CONCLUSIONS Prospective risks of alcohol consequences related to AmED use suggest a continued need for research and policy to address the surveillance, etiology, and prevention of AmED use BACKGROUND The mixing of alcohol and energy drinks (AMEDs) is a trend among college students associated with higher rates of heavy episodic drinking and negative alcohol-related consequences.
The goals of this study were to take a person-centered approach to identify distinct risk profiles of college students based on AMED-specific constructs (expectancies, attitudes, and norms) and examine longitudinal associations between AMED use, drinking, and consequences. METHODS A random sample of incoming freshmen (n = 387, 59% female) completed measures of AMED use, AMED-specific expectancies, attitudes, and normative beliefs, and drinking quantity and alcohol-related consequences. Data were collected at 2 occasions: spring semester of freshman year and fall semester of sophomore year. RESULTS Latent profile analysis identified 4 subgroups of individuals: occasional AMED, anti-AMED, pro-AMED, and strong peer influence. Individuals in the pro-AMED group reported the most AMED use, drinking, and consequences. There was a unique association between profile membership and AMED use, even after controlling for drinking. CONCLUSIONS Findings highlighted the importance of AMED-specific expectancies, attitudes, and norms. The unique association between AMED risk profiles and AMED use suggests AMED use is a distinct behavior that could be targeted by AMED-specific messages included in existing brief interventions for alcohol use Recent studies suggest that the combination of caffeine-containing drinks together with alcohol might reduce the subjective feelings of alcohol intoxication — the so-called “masking effect”. In this study, we aimed to review the effects of alcohol in combination with caffeine or energy drink, with special focus on the “masking effect”.
Fifty-two healthy male volunteers were analysed concerning breath alcohol concentration and subjective sensations of intoxication using an 18-item Visual Analogue Scale in a randomised, double-blinded, controlled, four-treatment cross-over trial after consumption of (A) placebo, (B) alcohol (vodka 37.5% at a dose of 46.5 g ethanol), (C) alcohol in combination with caffeine at a dose of 80 mg (equivalent to one 250 ml can of energy drink) and (D) alcohol in combination with energy drink at a dose of 250 ml (one can). Primary variables were headache, weakness, salivation and motor coordination. Out of four primary variables, weakness and motor coordination showed a statistically significant difference between the alcohol and non-alcohol groups; out of 14 secondary variables, five more variables (dizziness, alterations in sight, alterations in walking, agitation and alterations in speech) also showed significant differences, due mainly to contrasts with the non-alcohol group. In none of these end points could a statistically significant effect be found for the additional ingestion of energy drink or caffeine on the subjective feelings of alcohol intoxication. This within-subjects study does not confirm the presence of a “masking effect” when combining caffeine or energy drink with alcohol Data from the National Health and Nutrition Examination Survey (NHANES) indicate 89% of Americans regularly consume caffeine, but these data do not include military personnel. This cross-sectional study examined caffeine use in Navy and Marine Corps personnel, including prevalence, amount of daily consumption, and factors associated with use. A random sample of Navy and Marine Corps personnel was contacted and asked to complete a detailed questionnaire describing their use of caffeine-containing substances, in addition to their demographic, military, and lifestyle characteristics.
A total of 1708 service members (SMs) completed the questionnaire. Overall, 87% reported using caffeinated beverages ≥1 time/week, with caffeine users consuming a mean ± standard error of 226 ± 5 mg/day (242 ± 7 mg/day for men, 183 ± 8 mg/day for women). The most commonly consumed caffeinated beverages (% users) were coffee (65%), colas (54%), teas (40%), and energy drinks (28%). Multivariable logistic regression modeling indicated that characteristics independently associated with caffeine use (≥1 time/week) included older age, white race/ethnicity, higher alcohol consumption, and participating in less resistance training. Prevalence of caffeine use in these SMs was similar to that reported in civilian investigations, but daily consumption (mg/day) was higher Rationale Both glucose and caffeine can improve aspects of cognitive performance and, in the case of caffeine, mood. There are few studies investigating the effects of the two substances in combination. Objectives We assessed the mood, cognitive and physiological effects of a soft drink containing caffeine and glucose as well as flavouring levels of herbal extracts. The effects of different drink fractions were also evaluated. Methods Using a randomised, double-blind, balanced, five-way crossover design, 20 participants who were overnight fasted and caffeine-deprived received 250 ml drinks containing 37.5 g glucose; 75 mg caffeine; ginseng and ginkgo biloba at flavouring levels; a whole drink (containing all these substances); or a placebo (vehicle). Participants were assessed in each drink condition, separated by a 7-day wash-out period. Cognitive, psychomotor and mood assessment took place immediately prior to the drink and then 30 min thereafter. The primary outcome measures included five aspects of cognitive performance from the Cognitive Drug Research assessment battery. Mood, heart rate and blood glucose levels were also monitored.
Results Compared with placebo, the whole drink resulted in significantly improved performance on “secondary memory” and “speed of attention” factors. There were no other cognitive or mood effects. Conclusions This pattern of results would not be predicted from the effects of glucose and caffeine in isolation, either as seen here or from the literature addressing the effects of the substances in isolation. These data suggest that there is some degree of synergy between the cognition-modulating effects of glucose and caffeine which merits further investigation BACKGROUND There has been a dramatic rise in the consumption of alcohol mixed with energy drinks (AmED) in young people. AmED have been implicated in risky drinking practices, and greater accidents and injuries have been associated with their consumption. Despite the increased popularity of these beverages (e.g., Red Bull and vodka), there is little laboratory research examining how the effects of AmED differ from alcohol alone. This experiment was designed to investigate if the consumption of AmED alters neurocognitive and subjective measures of intoxication compared with the consumption of alcohol alone. METHODS Participants (n=56) attended 1 session where they were randomly assigned to receive one of 4 doses (0.65 g/kg alcohol, 3.57 ml/kg energy drink, AmED, or a placebo beverage). Performance on a cued go/no-go task was used to measure the response of inhibitory and activational mechanisms of behavioral control following dose administration. Subjective ratings of stimulation, sedation, impairment, and level of intoxication were recorded. RESULTS Alcohol alone impaired both inhibitory and activational mechanisms of behavioral control, as evidenced by increased inhibitory failures and increased response times compared to baseline performance.
Coadministration of the energy drink with alcohol counteracted some of the alcohol-induced impairment of response activation, but not response inhibition. For subjective effects, alcohol increased ratings of stimulation, feeling the drink, liking the drink, impairment, and level of intoxication, and alcohol decreased the rating of ability to drive. Coadministration of the energy drink with alcohol increased self-reported stimulation, but resulted in similar ratings of the other subjective effects as when alcohol was administered alone. CONCLUSIONS An energy drink appears to alter some of the objective and subjective impairing effects of alcohol, but not others. Thus, AmED may contribute to a high-risk scenario for the drinker. The mix of impaired behavioral inhibition and enhanced stimulation is a combination that may make AmED consumption riskier than alcohol consumption alone The consumption of alcohol mixed with energy drinks (AmED) has become a popular and controversial practice among young people. Increased rates of impaired driving and injuries have been associated with AmED consumption. The purpose of this study was to examine if the consumption of AmED alters cognitive processing and subjective measures of intoxication compared with the consumption of alcohol alone. Eighteen participants (nine men and nine women) attended four test sessions where they received one of four doses in random order (0.65 g/kg alcohol, 3.57 ml/kg energy drink, AmED, or a placebo beverage). Performance on a psychological refractory period (PRP) task was used to measure dual-task information processing, and performance on the Purdue pegboard task was used to measure simple and complex motor coordination following dose administration. In addition, various subjective measures of stimulation, sedation, impairment, and level of intoxication were recorded.
The results indicated that alcohol slowed dual-task information processing and impaired simple and complex motor coordination. The coadministration of the energy drink with alcohol did not alter the alcohol-induced impairment on these objective measures. For subjective effects, alcohol increased various ratings indicative of feelings of intoxication. More importantly, coadministration of the energy drink with alcohol reduced perceptions of mental fatigue and enhanced feelings of stimulation compared to alcohol alone. In conclusion, AmED may contribute to a high-risk scenario for a drinker. The mix of behavioral impairment with reduced fatigue and enhanced stimulation may lead AmED consumers to erroneously perceive themselves as better able to function than is actually the case OBJECTIVE The consumption of alcohol mixed with energy drinks (AmEDs) is a form of risky drinking among college students, a population already in danger of heavy drinking and associated consequences. The goals of the current longitudinal study were to (a) identify types of AmED users between the first and second year of college and (b) examine differences among these groups in rates of high-risk drinking and consequences over time. METHOD A random sample of college student drinkers (n = 1,710; 57.7% female) completed baseline and 6-month follow-up measures assessing alcohol-related behaviors. RESULTS AmED use was endorsed by 40% of participants during the course of the study. As anticipated, four distinct groups of AmED users were identified (nonusers, initiators, discontinuers, and continuous users) and were significantly different from one another on drinking and consequence outcomes. Further, significant Time × Group interaction effects were observed for drinking and overall consequences.
Generally, across all outcomes and time points, nonusers reported the lowest rates of drinking and consequences, whereas continuous users consistently reported the highest rates of drinking and consequences. Students who initiated AmED use during the course of the study also reported an abrupt increase in alcohol use and reported consequences. CONCLUSIONS Findings suggest students who consistently engage in and initiate AmED use also engage in riskier drinking behaviors and experience higher rates of consequences. Interventions that specifically target AmED use may be warranted and have the potential to reduce alcohol-related consequences BACKGROUND There has been a dramatic rise in the consumption of alcohol mixed with energy drinks (AmEDs) in social drinkers. It has been suggested that AmED beverages might lead individuals to drink greater quantities of alcohol. This experiment was designed to investigate whether the consumption of AmEDs would alter alcohol priming (i.e., increasing ratings of wanting another drink) compared with alcohol alone. METHODS Participants (n = 80) of equal gender attended 1 session where they were randomly assigned to receive 1 of 4 doses (0.91 ml/kg vodka, 1.82 ml/kg energy drink, 0.91 ml/kg vodka mixed with 1.82 ml/kg energy drink [AmED], or a placebo beverage). Alcohol-induced priming of the motivation to drink was assessed by self-reported ratings on the Desire for Drug questionnaire. RESULTS The priming dose of alcohol increased the subjective ratings of "desire" for more alcohol, consistent with previous research that small doses of alcohol can increase the motivation to drink. Furthermore, higher desire ratings over time were observed with AmEDs compared with alcohol alone. Finally, ratings of liking the drink were similar for the alcohol and AmED conditions. CONCLUSIONS An energy drink may elicit increased alcohol priming.
This study provides laboratory evidence that AmED beverages may lead to greater motivation to drink versus the same amount of alcohol consumed alone OBJECTIVE We examined the sociodemographic correlates of energy drink use and the differences between those who use them with and without alcohol in a representative community sample. METHODS A random-digit-dial landline telephone survey of adults in the Milwaukee, Wisconsin area responded to questions about energy drink and alcohol plus energy drink use. RESULTS Almost one-third of respondents consumed at least one energy drink in their lifetime, while slightly over 25% used energy drinks in the past year and 6% were past-year alcohol plus energy drink users. There were important racial/ethnic differences in consumption patterns. Compared to non-users, past-year energy drink users were more likely to be non-Black minorities; and past-year alcohol plus energy drink users, when compared to energy drink users only, were more likely to be White and younger. Alcohol plus energy drink users also were more likely to be hazardous drinkers. CONCLUSIONS Our results, which are among the first from a community sample, suggest a bifurcated pattern of energy drink use, highlighting important population consumption differences between users of energy drinks only and those who use alcohol and energy drinks together BACKGROUND Previous laboratory research on alcohol absorption has found that substitution of artificially sweetened alcohol mixers for sucrose-based mixers has a marked effect on the rate of gastric emptying, resulting in elevated blood alcohol concentrations. Studies conducted in natural drinking settings, such as bars, have indicated that caffeine ingestion while drinking is associated with higher levels of intoxication. To our knowledge, research has not examined the effects of alcohol mixers that contain both an artificial sweetener and caffeine, that is, diet cola.
Therefore, we assessed the event-specific association between diet cola consumption and alcohol intoxication in bar patrons. We sought to determine whether putative increases in blood alcohol, produced by accelerated gastric emptying following diet cola consumption, as identified in the laboratory, also appear in a natural setting associated with impaired driving. METHODS We conducted a secondary analysis of data from 2 nighttime field studies that collected anonymous information from 413 randomly selected bar patrons in 2008 and 2010. Data sets were merged and recoded to distinguish between energy drink, regular cola, diet cola, and noncaffeinated alcohol mixers. RESULTS Caffeinated alcohol mixers were consumed by 33.9% of the patrons. Cola-caffeinated mixed drinks were much more popular than those mixed with energy drinks. A large majority of regular cola-caffeinated mixed drink consumers were men (75%), whereas diet cola-caffeinated mixed drink consumers were more likely to be women (57%). After adjusting for the number of drinks consumed and other potential confounders, the number of diet cola mixed drinks had a significant association with patron intoxication (β = 0.233, p < 0.0001). The number of drinks mixed with regular (sucrose-sweetened) cola and energy drinks did not have significant associations with intoxication (p > 0.05). CONCLUSIONS Caffeine's effect on intoxication may be most pronounced when mixers are artificially sweetened, that is, lack sucrose, which slows the rate of gastric emptying of alcohol. Risks associated with on-premise drinking may be reduced by greater attention given to types of mixers, particularly diet colas BACKGROUND It has been argued that the excessive consumption of energy drinks (EDs) may have serious health consequences, and that it may serve as an indicator for substance use and other risky behaviors.
The present paper offers a perspective on this topic that remains underexplored in the population of adolescents. METHODS Data were collected via self-administered anonymous questionnaires from 870 adolescents aged 15 to 19 years who were recruited from a random sample of public secondary schools in the geographic area of the Calabria Region, in the South of Italy. RESULTS A total of 616 participants completed the survey, for a response rate of 70.8%. Nearly 68% of respondents had drunk at least a whole can of ED during their life, and about 55% reported consuming EDs during the 30 days before the survey. Only 13% of interviewed adolescents were aware that drinking EDs is the same as drinking coffee, whereas a sizable percentage believed that drinking EDs is the same as drinking carbonated beverages or rehydrating sport drinks. Forty-six percent of adolescents had drunk alcohol-mixed energy drinks (AmEDs) during their life, and 63% of lifetime users admitted drinking AmEDs during the 30 days before the survey. Overall, 210 (63.3%) had drunk alcohol alone, not mixed with EDs, during their life, and more than half (56.3%) reported having consumed it at least once during the 30 days before the survey. Multivariate analysis showed that the factors independently associated with the consumption of AmEDs were the increasing number of sexual partners, being a current smoker, being male, riding with a driver who had been drinking alcohol, and having used marijuana.
CONCLUSIONS Comprehensive educational programs among youths focusing on the potential health effects of EDs, alcohol, and the combination of the two, designed to empower the ability to manage these drinking habits, are strongly advisable BACKGROUND Previous research on alcohol mixed with energy drinks (AmED) has shown that use is typically driven by hedonistic, social, functional, and intoxication-related motives, with differential associations with alcohol-related harm across these constructs. There has been no research looking at whether there are subgroups of consumers based on patterns of motivations. Consequently, the aims were to determine the typology of motivations for AmED use among a community sample and to identify correlates of subgroup membership. In addition, we aimed to determine whether this structure of motivations applied to a university student sample. METHODS Data were used from an Australian community sample (n = 731) and an Australian university student sample (n = 594) who were identified as AmED consumers when completing an online survey about their alcohol and ED use. Participants reported their level of agreement with 14 motivations for AmED use; latent classes of AmED consumers were identified based on patterns of motivation endorsement using latent class analysis. RESULTS A 4-class model was selected using data from the community sample: (i) taste consumers (31%): endorsed pleasurable taste; (ii) energy-seeking consumers (24%): endorsed functional and taste motives; (iii) hedonistic consumers (33%): endorsed pleasure and sensation-seeking motives, as well as functional and taste motives; and (iv) intoxication-related consumers (12%): endorsed motives related to feeling in control of intoxication, as well as hedonistic, functional, and taste motives. The consumer subgroups typically did not differ on demographics, other drug use, alcohol and ED use, and AmED risk taking.
The patterns of motivations for the 4-class model were similar for the university student sample. CONCLUSIONS This study indicated the existence of 4 subgroups of AmED consumers based on their patterns of motivations for AmED use, consistently structured across the community and university student samples. These findings lend support to the growing conceptualization of AmED consumers as a heterogeneous group in regard to motivations for use, with a hierarchical and cumulative class order in regard to the number of types of motivation for AmED use. Prospective research may endeavor to link session-specific motives and outcomes, as it is apparent that primary consumption motives may be fluid between sessions
11,052
26,192,056
A majority of the studies revealed significant positive effects on health outcomes associated with digital videogame play among older adults .
OBJECTIVE This article is a systematic review of the research literature on digital videogames played by older adults and the health outcomes associated with game play. Findings from each study meeting the inclusion criteria were analyzed and summarized into emergent themes to determine the impact of digital games in promoting healthy behaviors among older adults.
AIM To evaluate the effects of ''Playstation EyeToy Games'' on upper extremity motor recovery and upper extremity-related motor functioning of patients with subacute stroke. METHODS The authors designed a randomized, controlled, assessor-blinded, 4-week trial, with follow-up at 3 months. A total of 20 hemiparetic inpatients (mean age 61.1 years), all within 12 months post-stroke, received 30 minutes of treatment with ''Playstation EyeToy Games'' per day, consisting of flexion and extension of the paretic shoulder, elbow and wrist as well as abduction of the paretic shoulder, or placebo therapy (watching the games for the same duration without physical involvement in the games), in addition to a conventional program, 5 days a week, 2-5 hours/day for 4 weeks. Brunnstrom's staging and self-care sub-items of the functional independence measure (FIM) were performed at 0 months (baseline), 4 weeks (post-treatment), and 3 months (follow-up) after the treatment. RESULTS The mean change score (95% confidence interval) of the FIM self-care score (5.5 [2.9-8.0] vs 1.8 [0.1-3.7], P=0.018) showed significantly more improvement in the EyeToy group compared to the control group. No significant differences were found between the groups for the Brunnstrom stages for hand and upper extremity. CONCLUSION ''Playstation EyeToy Games'' combined with a conventional stroke rehabilitation program have the potential to enhance upper extremity-related motor functioning in subacute stroke patients In a multiple-object tracking (MOT) task, young and older adults attentively tracked a subset of 10 identical, randomly moving disks for several seconds, and then tried to identify those disks that had comprised the subset. Young adults who habitually played video games performed significantly better than those who did not.
Compared to young subjects (mean age = 20.6 years) with whom they were matched for video game experience, older subjects (mean age = 75.3 years) showed a much reduced ability to track multiple moving objects, particularly with faster movement or longer tracking times. Control measurements with stationary disks show that the age-related decline in MOT was not caused by a general change in memory per se. To generate an item-wise performance measure, we examined older subjects' proportion correct according to the serial order in which individual disks were identified. Correct identification of target disks declined with the order in which targets were reported, suggesting that attentional tracking produced graded, rather than all-or-none, outcomes Background Evidence suggests that increasing intensity of rehabilitation results in better motor recovery. Limited evidence is available on the effectiveness of an interactive virtual reality gaming system for stroke rehabilitation. EVREST was designed to evaluate the feasibility, safety and efficacy of using the Nintendo Wii gaming virtual reality (VRWii) technology to improve arm recovery in stroke patients. Methods Pilot randomized study comparing VRWii versus recreational therapy (RT) in patients receiving standard rehabilitation within six months of stroke with a motor deficit of ≥3 on the Chedoke-McMaster Scale (arm). In this study we expect to randomize 20 patients. All participants (age 18–85) will receive customary rehabilitative treatment consisting of a standardized protocol (eight sessions, 60 min each, over a two-week period). Outcome measures The primary feasibility outcome is the total time receiving the intervention. The primary safety outcome is the proportion of patients experiencing intervention-related adverse events during the study period.
Efficacy, a secondary outcome measure, will be measured by the Wolf Motor Function Test, Box and Block Test, and Stroke Impact Scale at the four-week follow-up visit. From November 2008 to September 2009, 21 patients were randomized to VRWii or RT. Mean age, 61 (range 41–83) years. Mean time from stroke onset, 25 (range 10–56) days. Conclusions EVREST is the first randomized parallel controlled trial assessing the feasibility, safety, and efficacy of virtual reality using Wii gaming technology in stroke rehabilitation. The results of this study will serve as the basis for a larger multicentre trial. ClinicalTrials.gov registration # Objective. To investigate the effectiveness of computerized virtual reality (VR) training of the hemiparetic hand of patients poststroke using a system that provides repetitive motor reeducation and skill reacquisition. Methods. Eight subjects in the chronic phase poststroke participated in a 3-week program using their hemiparetic hand in a series of interactive computer games for 13 days of training, with weekend breaks, and pretests and posttests. Each subject trained for about 2 to 2.5 h per day. Outcome measures consisted of changes in the computerized measures of thumb and finger range of motion, thumb and finger velocity, fractionation (the ability to move fingers independently), thumb and finger strength, the Jebsen Test of Hand Function, and a kinematic reach-to-grasp test. Results. Subjects as a group improved in fractionation of the fingers, thumb and finger range of motion, and thumb and finger speed, retaining those gains at the 1-week retention test. Transfer of these improvements was demonstrated through changes in the Jebsen Test of Hand Function and a decrease after the therapy in the overall time from hand peak velocity to the moment when an object was lifted from the table. Conclusions.
It is difficult in current service delivery models to provide the intensity of practice that appears to be needed to effect neural reorganization and functional changes poststroke . Computerized exercise systems may be a way to maximize both the patients ’ and the clinicians ’ time . The data in this study add support to the proposal to explore novel technologies for incorporation into current practice The combination of active video gaming and exercise ( exergaming ) is suggested to improve elderly people 's balance , thereby decreasing fall risk . Exergaming has been shown to increase motivation during exercise therapy , due to the enjoyable and challenging nature , which could support long-term adherence for exercising balance . However , scarce evidence is available of the direct effects of exergaming on postural control . Therefore , the aim of the study was to assess the effect of a six-week videogame-based exercise program aim ed at improving balance in elderly people . Task performance and postural control were examined using an interrupted time series design . Results of multilevel analyses showed that performance on the dot task improved within the first two weeks of training . Postural control improved during the intervention . After the intervention period task performance and balance were better than before the intervention . Results of this study show that healthy elderly can benefit from a videogame-based exercise program to improve balance and that all subjects were highly motivated to exercise balance because they found gaming challenging and enjoyable The aims of this r and omized , single-blind crossover trial were to investigate the effect of adding a simulated bowling video game via the Nintendo Wii ® gaming system to the st and ard exercise regimen of cognitively intact residents of long-term care ( LTC ) with upper extremity dysfunction and to identify individual characteristics that might predict improvement . 
Residents ( n=34 ) were recruited through two LTC facilities in southwestern Ontario and were r and omized into a st and ard exercise ( SG ) or st and ard exercise plus Wii bowling ( Wii ) arm . After 4 weeks of intervention , the groups were crossed over to the opposite arm . Outcomes included measures of pain intensity and bothersomeness , physical activity enjoyment , and a six-item measure of functional capacity design ed specifically for residents of LTC . Results suggest that subjects improved on all outcomes from pre- to postintervention but that only enjoyment of activity showed a significant difference between the SG and Wii groups . Effect sizes ( Cohen 's d ) ranged from small ( 0.30 for bothersomeness ) to large ( 1.77 for functional capacity ) . Responders , defined as those subjects who reported any degree of improvement following the Wii intervention , were less likely to complain of stiffness or shoulder symptoms and were more likely to complain of h and symptoms than non-responders . Limitations in interpretation and recommendations for future research are presented Background Physical activity promotes health in older adults but participation rates are low . Interactive video dance games can increase activity in young persons but have not been design ed for use with older adults . The purpose of this research was to evaluate healthy older adults ’ interest and participation in a dance game adapted for an older user . Methods Healthy older adults were recruited from 3 senior living setting s and offered three months of training and supervision using a video dance game design ed for older people . Before and after the program , data was collected on vital signs , physical function and self reported quality of life . Feedback was obtained during and after training . Results Of 36 persons who entered ( mean age 80.1 + 5.4 years , 83 % female ) , 25 completed the study . Completers were healthier than noncompleters . 
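Several abstracts above report effect sizes as Cohen's d (e.g. 0.30 for bothersomeness to 1.77 for functional capacity in the Wii bowling trial). As a quick illustration of how such values are conventionally computed — the abstract does not say which variant was used, so this pooled-standard-deviation, two-group form and the sample scores below are assumptions for illustration only:

```python
import math

def cohens_d(group_a, group_b):
    """Cohen's d with a pooled standard deviation.

    Assumption: the abstract does not state which d variant was used;
    this pooled-SD, two-group form is a common textbook default.
    """
    na, nb = len(group_a), len(group_b)
    ma = sum(group_a) / na
    mb = sum(group_b) / nb
    # Unbiased sample variances
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled_sd

# Hypothetical pre/post functional-capacity scores (not study data):
pre = [10, 12, 11, 13, 12, 11]
post = [15, 16, 14, 17, 15, 16]
print(round(cohens_d(post, pre), 2))
```

By the usual rule of thumb, d near 0.2 is "small", 0.5 "medium", and 0.8 or above "large", which is how the trial's 0.30 and 1.77 are being characterized.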
Completers showed gains in narrow walk time, self-reported balance confidence and mental health. While there were no serious adverse events, 4 of 11 noncompleters withdrew due to musculoskeletal complaints. Conclusions Adapted interactive video dance is feasible for some healthy older adults and may help achieve physical activity goals The objective of this study was to investigate the potential of using a low-cost video-capture virtual reality (VR) platform, the Sony PlayStation II EyeToy, for the rehabilitation of older adults with disabilities. This article presents three studies that were carried out to provide information about the EyeToy's potential for use in rehabilitation. The first study included the testing of healthy young adults (N = 34) and compared their experiences using the EyeToy with those using GestureTek's IREX VR system in terms of a sense of presence, level of enjoyment, control, success, and perceived exertion. The second study aimed to characterize the VR experience of healthy older adults (N = 10) and to determine the suitability and usability of the EyeToy for this population, and the third study aimed to determine the feasibility of the EyeToy for use by individuals (N = 12) with stroke at different stages. The implications of these three studies for applying the system to rehabilitation are discussed Background Due to the many problems associated with reduced balance and mobility, providing an effective and engaging rehabilitation regimen is essential to progress recovery from impairments and to help prevent further degradation of motor skills. Objectives The purpose of this study was to examine the feasibility and benefits of physical therapy based on a task-oriented approach delivered via an engaging, interactive video game paradigm. The intervention focused on performing targeted dynamic tasks, which included reactive balance controls and environmental interaction.
Design This study was a randomized controlled trial. Setting The study was conducted in a geriatric day hospital. Participants Thirty community-dwelling and ambulatory older adults attending the day hospital for treatment of balance and mobility limitations participated in the study. Interventions Participants were randomly assigned to either a control group or an experimental group. The control group received the typical rehabilitation program consisting of strengthening and balance exercises provided at the day hospital. The experimental group received a program of dynamic balance exercises coupled with video game play, using a center-of-pressure position signal as the computer mouse. The tasks were performed while standing on a fixed floor surface, with progression to a compliant sponge pad. Each group received 16 sessions, scheduled 2 per week, with each session lasting 45 minutes. Measurements Data for the following measures were obtained before and after treatment: Berg Balance Scale, Timed “Up & Go” Test, Activities-specific Balance Confidence Scale, modified Clinical Test of Sensory Interaction and Balance, and spatiotemporal gait variables assessed in an instrumented carpet system test. Results Findings demonstrated significant improvements in posttreatment balance performance scores for both groups, and change scores were significantly greater in the experimental group compared with the control group. No significant treatment effect was observed in either group for the Timed “Up & Go” Test or spatiotemporal gait variables. Limitations The sample size was small, and there were group differences at baseline in some performance measures. Conclusion Dynamic balance exercises on fixed and compliant sponge surfaces were feasibly coupled to interactive game-based exercise. This coupling, in turn, resulted in a greater improvement in dynamic standing balance control compared with the typical exercise program.
However, there was no transfer of effect to gait function OBJECTIVE To determine whether a dance mat test of choice stepping reaction time (CSRT) is reliable and can detect differences in fall risk in older adults. DESIGN Randomized order, crossover comparison. SETTING Balance laboratory, medical research institute, and retirement village. PARTICIPANTS Older (mean age, 78.87±5.90y; range, 65-90y) independent-living people (N=47) able to walk in place without assistance. INTERVENTIONS Not applicable. MAIN OUTCOME MEASURES Reaction (RT), movement, and response times of dance pad-based stepping tests, Physiological Profile Assessment (PPA) score, Digit Symbol Substitution Test (DSST) score, time to complete the Trail Making Test (TMT) A+B, Falls Efficacy Scale International (FES-I) score, Activities-specific Balance Confidence (ABC) Scale score, and the incidental activity subscore of the Incidental and Planned Exercise Questionnaire (IPEQ). RESULTS Test-retest reliability of the dance mat CSRT response time was high (intraclass correlation coefficient model 3,k=.90; 95% confidence interval [CI], .82-.94; P<.001) and correlated highly with the existing laboratory-based measure (r=.86; 95% CI, .75-.92; P<.001). Concurrent validity was shown by significant correlations between response time and measures of fall risk (PPA: r=.42; 95% CI, .15-.63; P<.01; TMT A: r=.61; 95% CI, .39-.77; TMT B: r=.55; 95% CI, .31-.72; DSST: r=-.53; 95% CI, -.71 to -.28; P<.001; FES-I: Spearman ρ=.50; 95% CI, .25-.69; ABC Scale: Spearman ρ=-.58; 95% CI, -.74 to -.35; P<.01).
Participants with moderate/high fall-risk scores (PPA score > 1) had significantly slower response times than people with low/mild fall-risk scores (PPA score < 1), at 1146±182 and 1010±132 ms, respectively (P=.005), and multiple fallers and single/nonfallers showed significant differences in RT (883±137 vs 770±100 ms; P=.009) and response time (1180±195 vs 1031±145 ms; P=.017). CONCLUSIONS The new dance mat device is a valid and reliable tool for assessing stepping ability and fall risk in older community-dwelling people. Because it is highly portable, it can be used in clinic settings and the homes of older people as both an assessment and training device OBJECTIVES Subsyndromal depression (SSD) is several times more common than major depression in older adults and is associated with significant negative health outcomes. Physical activity can improve depression, but adherence is often poor. The authors assessed the feasibility, acceptability, and short-term efficacy and safety of a novel intervention using exergames (entertaining video games that combine game play with exercise) for SSD in older adults. METHODS Community-dwelling older adults (N = 19, aged 63-94 years) with SSD participated in a 12-week pilot study (with follow-up at 20-24 weeks) of Nintendo's Wii Sports, with three 35-minute sessions a week. RESULTS Eighty-six percent of enrolled participants completed the 12-week intervention. There was a significant improvement in depressive symptoms, mental health-related quality of life (QoL), and cognitive performance, but not physical health-related QoL. There were no major adverse events, and improvement in depression was maintained at follow-up. CONCLUSIONS The findings provide preliminary indication of the benefits of exergames in seniors with SSD.
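The dance mat study reports correlation coefficients with 95% CIs (e.g. r=.86, 95% CI .75-.92, N=47). Those intervals are close to what the standard Fisher z-transformation yields; the sketch below reproduces one of them under the assumption — not stated in the abstract — that this textbook method was used:

```python
import math

def pearson_r_ci(r, n, z_crit=1.96):
    """95% CI for a Pearson correlation via the Fisher z-transformation.

    Assumption: the abstract does not describe how its CIs were derived;
    Fisher's z is simply the standard method for correlation CIs.
    """
    z = math.atanh(r)                 # Fisher transform of r
    se = 1.0 / math.sqrt(n - 3)       # standard error on the z scale
    lo, hi = z - z_crit * se, z + z_crit * se
    return math.tanh(lo), math.tanh(hi)   # back-transform to the r scale

lo, hi = pearson_r_ci(0.86, 47)
print(f"95% CI: {lo:.2f} to {hi:.2f}")   # close to the reported .75-.92
```

The back-transformed interval is asymmetric around r, which is expected: the z scale is symmetric, but tanh compresses values near ±1.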
Randomized controlled trials of exergames for late-life SSD are warranted Background: Many studies have suggested that cognitive training can result in cognitive gains in healthy older adults. We investigated whether personalized computerized cognitive training provides greater benefits than those obtained by playing conventional computer games. Methods: This was a randomized double-blind interventional study. Self-referred healthy older adults (n = 155, 68 ± 7 years old) were assigned to either a personalized, computerized cognitive training group or to a computer games group. Cognitive performance was assessed at baseline and after 3 months by a neuropsychological assessment battery. Differences in cognitive performance scores between and within groups were evaluated using mixed effects models in 2 approaches: adherence only (AO; n = 121) and intention to treat (ITT; n = 155). Results: Both groups improved in cognitive performance. The improvement in the personalized cognitive training group was significant (p < 0.03, AO and ITT approaches) in all 8 cognitive domains. However, in the computer games group it was significant (p < 0.05) in only 4 (AO) or 6 domains (ITT). In the AO analysis, personalized cognitive training was significantly more effective than playing games in improving visuospatial working memory (p = 0.0001), visuospatial learning (p = 0.0012) and focused attention (p = 0.0019). Conclusions: Personalized, computerized cognitive training appears to be more effective than computer games in improving cognitive performance in healthy older adults. Further studies are needed to evaluate the ecological validity of these findings
11,053
23,494,446
Higher sulfonylurea doses did not reduce HbA1c more than lower doses. Conclusions/interpretation Sulfonylurea monotherapy lowered HbA1c level more than previously reported, and we found no evidence that increasing sulfonylurea doses resulted in lower HbA1c. HbA1c is a surrogate endpoint, and we were unable to examine long-term endpoints in these predominantly short-term trials, but sulfonylureas appear to be associated with an increased risk of hypoglycaemic events
Aims/hypothesis Sulfonylureas are widely prescribed glucose-lowering medications for diabetes, but the extent to which they improve glycaemia is poorly documented. This systematic review evaluates how sulfonylurea treatment affects glycaemic control.
OBJECTIVE To examine whether and how improvement of glycemic control by long-term insulin therapy decreases endothelial activation as measured by serum levels of the soluble adhesion molecules sE-selectin and vascular cell adhesion molecule (VCAM-1), and whether the drug used to lower blood glucose in addition to insulin influences such a response. RESEARCH DESIGN AND METHODS Circulating adhesion molecules were measured before and after 3 and 12 months of therapy in 81 patients with type 2 diabetes and 41 subjects without diabetes. The patients were treated with bedtime administration of NPH insulin combined with either glibenclamide (n = 19), metformin (n = 17), glibenclamide and metformin (n = 17), or morning administration of NPH insulin (n = 23). RESULTS Before insulin therapy, serum sE-selectin level was 71% higher in the patients with type 2 diabetes (77 ± 4 ng/ml) than in the normal subjects (45 ± 3 ng/ml, P < 0.001), whereas levels of sVCAM-1 were comparable (420 ± 25 vs. 400 ± 11 ng/ml, respectively). Glycemic control in all patients improved as judged from a decrease in HbA1c from 9.7 ± 0.2 to 7.6 ± 0.1% (P < 0.001). sE-selectin decreased to 67 ± 4 ng/ml by 3 months (P < 0.001 vs. 0 months) and then remained unchanged until 12 months (70 ± 4 ng/ml, P < 0.001 vs. 0 months). sVCAM-1 levels at 12 months were similar to those at 0 months (416 ± 25 ng/ml). The change in glycemic control, measured by HbA1c, but not in other parameters, was correlated with the change in sE-selectin (r = 0.41, P < 0.001) within the patients with type 2 diabetes. The decreases in sE-selectin were not different between the various treatment groups.
CONCLUSIONS We conclude that improvement in glycemic control by administration of insulin alone or insulin combined with either glibenclamide, metformin, or both agents induces a sustained decrease in sE-selectin, the magnitude of which seems to be dependent on the degree of improvement in glycemia. These data suggest that sE-selectin might provide a marker of effects of treatment of chronic hyperglycemia on endothelial activation OBJECTIVE—The efficacy and safety of adding liraglutide (a glucagon-like peptide-1 receptor agonist) to metformin were compared with addition of placebo or glimepiride to metformin in subjects previously treated with oral antidiabetes (OAD) therapy. RESEARCH DESIGN AND METHODS—In this 26-week, double-blind, double-dummy, placebo- and active-controlled, parallel-group trial, 1,091 subjects were randomly assigned (2:2:2:1:2) to once-daily liraglutide (either 0.6, 1.2, or 1.8 mg/day injected subcutaneously), to placebo, or to glimepiride (4 mg once daily). All treatments were in combination therapy with metformin (1 g twice daily). Enrolled subjects (aged 25–79 years) had type 2 diabetes, A1C of 7–11% (previous OAD monotherapy for ≥3 months) or 7–10% (previous OAD combination therapy for ≥3 months), and BMI ≤40 kg/m2. RESULTS—A1C values were significantly reduced in all liraglutide groups versus the placebo group (P < 0.0001), with mean decreases of 1.0% for 1.8 mg liraglutide, 1.2 mg liraglutide, and glimepiride and 0.7% for 0.6 mg liraglutide, and an increase of 0.1% for placebo. Body weight decreased in all liraglutide groups (1.8–2.8 kg) compared with an increase in the glimepiride group (1.0 kg; P < 0.0001). The incidence of minor hypoglycemia with liraglutide (∼3%) was comparable to that with placebo but less than that with glimepiride (17%; P < 0.001). Nausea was reported by 11–19% of the liraglutide-treated subjects versus 3–4% in the placebo and glimepiride groups.
The incidence of nausea declined over time. CONCLUSIONS—In subjects with type 2 diabetes, once-daily liraglutide induced similar glycemic control, reduced body weight, and lowered the occurrence of hypoglycemia compared with glimepiride, when both had background therapy of metformin We conducted a double-blind crossover study to determine which patient characteristics best predict a beneficial response to combined insulin-glyburide therapy. Glyburide (15 mg/day) or placebo was added to the treatment regimen of 31 insulin-treated type II (non-insulin-dependent) diabetic subjects. During glyburide therapy, there was a significant improvement in glycemic control, with a reduction in glycosylated hemoglobin from 9.9 ± 1.3 to 9.1 ± 1.3% (P < .001). Patients who responded had higher fasting C-peptide levels (P < .001) and shorter durations of insulin therapy (P < .01) than those who did not respond. Glyburide withdrawal was associated with a greater-than-expected deterioration in glycemic control. Patients on insulin therapy for > 8 yr are unlikely to benefit significantly from the addition of glyburide to their treatment regimen To assess the effect of glipizide on glycaemic control and peripheral insulin sensitivity, 9 type 1 (insulin-dependent) diabetic patients with normal BMI, mean age 42.1 ± 11.0 years, diabetes duration 16.3 ± 9.2 years, were studied. They were treated by continuous subcutaneous insulin infusion for a mean duration of 32.2 ± 11.0 months; they were in good glycaemic control (mean HbA1 7.9 ± 1.2%, upper limit of normal value 7.5%). In a double-blind randomized control study they were successively allocated for a three-month period to 15 mg of glipizide daily or a placebo.
At the end of each period the following parameters were recorded: HbA1, mean plasma glucose levels, daily insulin dosage (basal rate and preprandial bolus), and peripheral insulin sensitivity assessed by the euglycaemic hyperinsulinaemic clamp technique. The addition of glipizide did not induce any statistically significant modification of HbA1, glycaemic values, or daily insulin dosage: basal rate 18.2 ± 8.7 vs. 17.9 ± 7.3 IU/24 hours and preprandial bolus 18.6 ± 7.0 vs. 17.6 ± 6.3 IU/24 hours. During the glucose clamp, glucose uptake was similar under glipizide or placebo at the 3 levels of insulin infused. These results suggest that in type 1 diabetic patients the addition of glipizide to insulin therapy does not alter glycaemic control or peripheral insulin sensitivity BACKGROUND Sulfonylurea drugs have been the only oral therapy available for patients with non-insulin-dependent diabetes mellitus (NIDDM) in the United States. Recently, however, metformin has been approved for the treatment of NIDDM. METHODS We performed two large, randomized, parallel-group, double-blind, controlled studies in which metformin or another treatment was given for 29 weeks to moderately obese patients with NIDDM whose diabetes was inadequately controlled by diet (protocol 1: metformin vs. placebo; 289 patients), or diet plus glyburide (protocol 2: metformin and glyburide vs. metformin vs. glyburide; 632 patients). To determine efficacy we measured plasma glucose (while the patients were fasting and after the oral administration of glucose), lactate, lipids, insulin, and glycosylated hemoglobin before, during, and at the end of the study. RESULTS In protocol 1, at the end of the study the 143 patients in the metformin group, as compared with the 146 patients in the placebo group, had lower mean (± SE) fasting plasma glucose concentrations (189 ± 5 vs. 244 ± 6 mg per deciliter [10.6 ± 0.3 vs.
13.7 ± 0.3 mmol per liter], P < 0.001) and glycosylated hemoglobin values (7.1 ± 0.1 percent vs. 8.6 ± 0.2 percent, P < 0.001). In protocol 2, the 213 patients given metformin and glyburide, as compared with the 210 patients treated with glyburide alone, had lower mean fasting plasma glucose concentrations (187 ± 4 vs. 261 ± 4 mg per deciliter [10.5 ± 0.2 vs. 14.6 ± 0.2 mmol per liter], P < 0.001) and glycosylated hemoglobin values (7.1 ± 0.1 percent vs. 8.7 ± 0.1 percent, P < 0.001). The effect of metformin alone was similar to that of glyburide alone. Eighteen percent of the patients given metformin and glyburide had symptoms compatible with hypoglycemia, as compared with 3 percent in the glyburide group and 2 percent in the metformin group. In both protocols the patients given metformin had statistically significant decreases in plasma total and low-density lipoprotein cholesterol and triglyceride concentrations, whereas the values in the respective control groups did not change. There were no significant changes in fasting plasma lactate concentrations in any of the groups. CONCLUSIONS Metformin monotherapy and combination therapy with metformin and sulfonylurea are well tolerated and improve glycemic control and lipid concentrations in patients with NIDDM whose diabetes is poorly controlled with diet or sulfonylurea therapy alone Abstract Objective: To determine the relation between systolic blood pressure over time and the risk of macrovascular or microvascular complications in patients with type 2 diabetes. Design: Prospective observational study. Setting: 23 hospital-based clinics in England, Scotland, and Northern Ireland. Participants: 4801 white, Asian Indian, and Afro-Caribbean UKPDS patients, whether randomised or not to treatment, were included in analyses of incidence; of these, 3642 were included in analyses of relative risk.
Outcome measures: Primary predefined aggregate clinical outcomes: any complications or deaths related to diabetes and all-cause mortality. Secondary aggregate outcomes: myocardial infarction, stroke, lower extremity amputation (including death from peripheral vascular disease), and microvascular disease (predominantly retinal photocoagulation). Single end points: non-fatal heart failure and cataract extraction. Risk reduction associated with a 10 mm Hg decrease in updated mean systolic blood pressure adjusted for specific confounders. Results: The incidence of clinical complications was significantly associated with systolic blood pressure, except for cataract extraction. Each 10 mm Hg decrease in updated mean systolic blood pressure was associated with reductions in risk of 12% for any complication related to diabetes (95% confidence interval 10% to 14%, P<0.0001), 15% for deaths related to diabetes (12% to 18%, P<0.0001), 11% for myocardial infarction (7% to 14%, P<0.0001), and 13% for microvascular complications (10% to 16%, P<0.0001). No threshold of risk was observed for any end point. Conclusions: In patients with type 2 diabetes the risk of diabetic complications was strongly associated with raised blood pressure. Any reduction in blood pressure is likely to reduce the risk of complications, with the lowest risk being in those with systolic blood pressure less than 120 mm Hg OBJECTIVE To assess the efficacy, safety, and dose-response relationship of glimepiride in patients with NIDDM. RESEARCH DESIGN AND METHODS After a 21-day placebo washout period, 304 patients were randomized to receive either placebo or glimepiride, 1, 4, or 8 mg once daily. Fasting plasma glucose (FPG), 2-h postprandial glucose (PPG), and HbA1c were measured at predetermined intervals during the washout period and the 14-week study. Adverse events were tabulated.
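The UKPDS abstract above expresses risk on a per-10-mm Hg basis and reports no threshold of risk. If one assumes a log-linear model — an interpretation the "no threshold" finding invites, but which the abstract does not state explicitly — each 10 mm Hg decrement multiplies risk by the same ratio, so reductions compound multiplicatively rather than additively:

```python
def compounded_risk_reduction(per_10mmhg_reduction, decrease_mmhg):
    """Overall risk reduction under an assumed log-linear dose-response,
    where every 10 mm Hg decrement multiplies risk by the same ratio.
    Illustrative only: the UKPDS abstract reports per-10-mm Hg estimates,
    not this extrapolation.
    """
    ratio = 1.0 - per_10mmhg_reduction   # e.g. 0.88 per 10 mm Hg
    steps = decrease_mmhg / 10.0
    return 1.0 - ratio ** steps

# 12% per 10 mm Hg ("any diabetes-related complication"), 30 mm Hg decrease:
print(round(compounded_risk_reduction(0.12, 30), 3))
```

Under this assumption a 30 mm Hg decrease corresponds to roughly a 32% reduction (1 - 0.88³), noticeably less than the 36% a naive additive reading (3 × 12%) would suggest.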
RESULTS At each patient visit , reduction from baseline FPG was greater in each glimepiride group than in the placebo group ( P < 0.001 ) . Changes from baseline to endpoint after 1 , 4 , and 8 mg glimepiride exceeded those after placebo ( P < 0.001 ) by 2.4 , 3.9 , and 4.1 mmol/l , respectively , for FPG ; by 1.2 , 1.8 , and 1.9 percentage points , respectively , for HbA1c ; and by 3.5 , 5.1 , and 5.2 mmol/l , respectively , for 2-h PPG . Greater reductions in these parameters were observed with 8 and 4 mg than with 1 mg ( P < 0.05 ) , indicating a dose-response relationship . When patients with baseline HbA1c levels > or = 8 % were assessed , more patients who received 8 mg glimepiride had HbA1c values < 8 % at endpoint compared with patients receiving 4 mg . Glimepiride had a favorable safety profile . CONCLUSIONS Glimepiride in 1- , 4- , or 8-mg doses was effective and well tolerated . Although the 4- and 8-mg once-daily doses were significantly more potent than the 1-mg dose , all three doses yielded clinical improvement . Because the 8-mg dose controlled HbA1c values in a greater number of patients with high baseline HbA1c levels than did the 4-mg dose , this higher dose might be beneficial for patients who are difficult to treat AIMS To study whether changes in endogenous insulin secretion at the same glycaemic control affect the plasma concentrations of lipoproteins in patients with Type 2 diabetes mellitus . METHODS Fifteen patients , age 59+/-2 years ( mean + /- SEM ) , body weight 86.3+/-3.0 kg , body mass index 29.6+/-0.9 kg/m2 were treated with sulphonylurea and insulin in combination or with insulin alone in a r and omized , double-blind , crossover study . All patients were treated with a multiple daily injection regimen with the addition of glibenclamide 10.5 mg daily or placebo tablets . RESULTS During combination therapy , the dose of insulin was 25 % less ( P < 0.002 ) and there was a 29 % increase in plasma C-peptide concentration ( P = 0.01 ) . 
Plasma levels of free insulin were not changed . Plasma levels of sex hormone-binding globulin ( SHBG ) and insulin-like growth factor-binding protein (IGFBP)-1 were lowered . There were no differences in the 24-h blood glucose profiles or HbA1c ( 6.0+/-0.2 vs. 6.3+/-0.2 % ; P = 0.16 ) . Body weight was similar . There was a significant decrease in plasma LDL cholesterol ( 3.04+/-0.24 vs. 3.41+/-0.21 mmol/l ; P = 0.04 ) , apolipoprotein A1 and of lipoprotein(a ) but an increase in VLDL-triglycerides ( 1.36+/-0.31 vs. 0.96+/-0.16 mmol/l ; P = 0.02 ) during combination therapy . The ratio between LDL cholesterol and apolipoprotein B concentrations was significantly lower during combination therapy ( P < 0.01 ) . CONCLUSIONS Combination therapy with insulin and sulphonylureas increases portal insulin supply and thereby alters liver lipoprotein metabolism when compared with insulin therapy alone OBJECTIVE : To assess the ability of a combination of insulin and an oral hypoglycemic agent ( glyburide ) to improve the overall glycemic control in a population of patients with type I diabetes . DESIGN : R and omized , placebo-controlled , double-blind trial . SETTING : Community-based , university-affiliated , family medicine group . PATIENTS : Men and women between 18 and 68 years of age with type I diabetes . INTERVENTIONS : Subjects were observed and titrated on an insulin only regimen for 12 weeks ( phase I ) . Subjects were then r and omized to receive either placebo or glyburide 10 mg/d for an additional 12 weeks ( phase II ) . MAIN OUTCOME MEASURES : Glucose measurements were taken at breakfast , lunch , supper , and bedtime . Each patient also was followed sequentially for serum lipids , glycosylated hemoglobin , ( Hb A1c ) and daily insulin utilization . 
RESULTS : Average fasting blood glucose ( FBG ) measurements were significantly lower in the glyburide-treated group during phase II ( 9.22 ± 0.55 mmol/L ) compared with baseline ( 10.27 ± 0.93 mmol/L ) and phase I ( 10.41 ± 0.55 mmol/L ) . A decrease in the average Hb A1c concentration in the glyburide group was evident by week 4 and was sustained for the duration of the study . The average daily insulin dose rose significantly in the glyburide but not the placebo group compared with baseline . Total cholesterol , triglycerides , and low-density lipoprotein cholesterol did not change significantly in either group over the course of the study . High-density lipoprotein cholesterol increased significantly over baseline in the glyburide group during phase II . Several patients experienced dramatic improvements in glycemic parameters after the addition of glyburide to their insulin regimens . CONCLUSIONS : Improvements were observed in the FBG and Hb A1c measurements of this heterogeneous population of patients with type I diabetes after the addition of glyburide to their insulin regimens . The study failed to find consistent trends in glycemic control when evaluating mean changes in FBG measurements Summary The effects of combined insulin and sulfonylurea therapy on glycaemic control and B-cell function was studied in 15 Type 2 ( non-insulin-dependent ) diabetic patients who had failed on treatment with oral hypoglycaemic agents . The patients were first treated with insulin alone for four months . Five patients were given two daily insulin doses and ten patients one dose . During insulin treatment the fasting plasma glucose fell from 14.5±0.8 to 8.8±0.4 mmol/l and the HbA1 concentration from 12.6±0.4 to 9.2±0.2 % . This improvement of glycaemic control was associated with a suppression of basal ( from 0.31±0.04 to 0.10±0.02 nmol/l ) and glucagon-stimulated ( from 0.50±0.08 to 0.19±0.04 nmol/l ) C-peptide concentrations . 
Four months after starting insulin therapy the patients were r and omised to a four-month double-blind cross-over treatment with insulin combined with either 15 mg glibenclamide per day or with placebo . Addition of glibenclamide to insulin result ed in a further reduction of the fasting plasma glucose ( 7.9±0.5 mmol/l ) and HbA1 ( 8.3±0.2 % ) concentration whereas the basal ( 0.21±0.03 nmol/l ) and glucagon-stimulated C-peptide concentrations ( 0.34±0.06 nmol/l ) increased again . Addition of placebo to insulin had no effect . The daily insulin dose could be reduced by 25 % after addition of glibenclamide to insulin , while it remained unchanged when insulin was combined with placebo . The fasting free insulin concentration did not differ between the glibenclamide and placebo periods ( 28±6 vs 30±5 mmol/l ) . The fasting free insulin concentration correlated , however , positively with the insulin dose ( r=0.76 , p<0.01 ) indicating that the insulin dose was the main determinant of the free insulin concentration . In contrast , the basal C-peptide concentration was higher during the insulin plus glibenclamide than during the insulin plus placebo period ( 0.21±0.03 vs 0.16±0.03 nmol/l ; p<0.05 ) . Addition of glibenclamide to insulin therapy increased the treatment cost by 30–50 % , was associated with increased frequency of mild hypoglycaemic reactions and with a slight , but significant fall in HDL cholesterol concentration ( from 1.40±0.07 to 1.29±0.06 ; p<0.05 ) compared with insulin plus placebo . We conclude that in Type 2 diabetic patients , who have failed on treatment with oral hypoglycaemic agents , the combination of insulin and glibenclamide result ed in slightly improved glycaemic control and allowed reduction of the insulin dose . The price for this improvement was higher treatment costs , more ( mild ) hypoglycaemic reactions and a marginal fall in the HDL cholesterol concentration . 
Whether the same effect could have been achieved with divided insulin doses in all patients is not known. Microvascular and neuropathic complications of diabetes mellitus can be significantly decreased by long-term, near-normoglycemic regulation in patients with insulin-dependent diabetes mellitus. Prevention or delay of onset of hyperglycemia in non-insulin-dependent diabetes mellitus (NIDDM) patients should reduce morbidity and mortality from these complications. NIDDM can be nearly normoglycemic when diagnosed by screening before its symptomatic stage or when clinically hyperglycemic NIDDM goes into remission. One potential strategy to delay the onset of hyperglycemia in individuals at high risk is chronic low-dose sulfonylurea therapy. Thirty black NIDDM subjects who had recently developed near-normoglycemia were followed with no treatment or were randomly assigned to a 3-year, double-blind glipizide or placebo treatment. Baseline and follow-up parameters included fasting plasma glucose (FPG), HbA1c, plasma insulin, glucose responses to an oral glucose tolerance test, and insulin action, as determined by the euglycemic insulin clamp. Baseline FPG and HbA1c for all three groups were 107 mg/dl and 4.7%, respectively. Relapse to hyperglycemia was defined as an FPG level ≥140 mg/dl on several consecutive visits or an FPG level ≥140 mg/dl with symptoms of hyperglycemia. During the course of the treatment and follow-up, hyperglycemia occurred in 6 of 10 subjects in the no-treatment group, 6 of 10 in the placebo group, and 2 of 10 in the glipizide treatment group. Prolongation of near-normoglycemia was significantly (P < 0.05) increased by low-dose (2.5 mg/day) glipizide compared with placebo treatment.
Low-dose sulfonylurea therapy delays the onset of hyperglycemia in NIDDM subjects in remission and may be a useful method to delay the onset of NIDDM in high-risk individuals. OBJECTIVE: To compare the efficacy and safety of two daily doses of the new sulfonylurea, glimepiride (Amaryl), each as a once-daily dose or in two divided doses, in patients with NIDDM. RESEARCH DESIGN AND METHODS: Of the previously treated NIDDM patients, 416 entered this multicenter randomized double-blind placebo-controlled fixed-dose study. After a 3-week placebo washout, patients received a 14-week course of placebo or glimepiride 8 mg q.d., 4 mg b.i.d., 16 mg q.d., or 8 mg b.i.d. RESULTS: Fasting plasma glucose (FPG) and HbA1c values were similar at baseline in all treatment groups. The placebo group's FPG value increased from 13.0 mmol/l at baseline to 14.5 mmol/l at the last evaluation endpoint (P ≤ 0.001). In contrast, FPG values in the four glimepiride groups decreased from a range of 12.4–12.9 mmol/l at baseline to a range of 8.6–9.8 mmol/l at endpoint (P ≤ 0.001, within-group change from baseline; P ≤ 0.001, between-group change [vs. placebo] from baseline). Two-hour postprandial plasma glucose (PPG) findings were consistent with FPG findings. In the placebo group, the HbA1c value increased from 7.7% at baseline to 9.7% at endpoint (P ≤ 0.001), whereas HbA1c values for the glimepiride groups were 7.9–8.1% at baseline and 7.4–7.6% at endpoint (P ≤ 0.001, within-group change from baseline; P ≤ 0.001, between-group change from baseline). There were no meaningful differences in glycemic variables between daily doses of 8 and 16 mg or between once- and twice-daily dosing. Adverse events and laboratory data demonstrate that glimepiride has a favorable safety profile. CONCLUSIONS: Glimepiride is an effective and well-tolerated oral glucose-lowering agent.
The results of this study demonstrate that maximum effectiveness can be achieved with 8 mg q.d. of glimepiride in NIDDM subjects. AIM: To investigate the onset of treatment effects over time observed for liraglutide in combination with oral antidiabetic drugs (OADs). METHODS: This analysis included patients from three phase 3, 26-week, randomised, double-blind, parallel-group trials. Prior to randomisation, patients underwent a run-in and titration period with metformin (Liraglutide Effect and Action in Diabetes-2, LEAD-2), glimepiride (LEAD-1) or metformin plus rosiglitazone (LEAD-4). Patients were then randomised to receive liraglutide (0.6, 1.2 or 1.8 mg once daily), active comparator and/or placebo. For this analysis, only the 1.2 mg and 1.8 mg liraglutide doses were included. Outcome measures included change in HbA1c, fasting plasma glucose (FPG), weight and systolic blood pressure (SBP). The safety profile was also investigated. RESULTS: Significant reductions in HbA1c were observed within 8 weeks of treatment with liraglutide plus OADs (p < 0.0001) and maintained until week 26. Furthermore, liraglutide plus OADs led to significant reductions in FPG within 2 weeks (p < 0.0001), sustained over 26 weeks. Adding liraglutide to metformin or metformin plus rosiglitazone also led to early and maintained reductions in body weight (within 8 weeks, p < 0.0001); however, liraglutide treatment plus glimepiride was weight neutral. Rapid reductions in SBP were observed for liraglutide plus OADs (within 2 weeks, p < 0.05–0.001) and maintained for 26 weeks. Some patients experienced nausea, which for the majority diminished within 2 weeks. CONCLUSION: Liraglutide treatment combined with OADs led to rapid improvements in FPG and SBP. Early reductions in HbA1c and body weight were also observed.
Adding liraglutide to OADs early on may therefore be a good treatment option for patients with type 2 diabetes. Combined insulin and sulfonylurea therapy for type 2 diabetes may improve the effectiveness of a single injection of insulin, thereby postponing the need for multiple injections. This concept was tested in 21 obese subjects imperfectly controlled by 20 mg of glyburide daily in a double-masked, placebo-controlled, parallel-design, 16-week protocol. Premixed 70% NPH/30% Regular insulin was taken before supper, and the dosage was adjusted weekly by an algorithm seeking nearly normal fasting glycemia. Eleven subjects using insulin plus 10 mg glyburide before breakfast had lower mean fasting glucose at 10–16 weeks than 10 subjects using insulin with placebo (mean ± SEM; 5.9 ± 0.3 versus 7.5 ± 0.7 mmol/L; p < 0.05), and had a greater decrement of glycosylated hemoglobin from baseline values (1.3 ± 0.1 versus 0.8 ± 0.2% A1, p < 0.05). After 16 weeks the combined-therapy group used half as much insulin as the insulin-only group (50 ± 5 versus 101 ± 13 units/d; p < 0.01). Fasting serum free insulin values increased 58% from baseline after insulin therapy in the insulin-only group (p < 0.05) but did not increase with combined therapy. Weight gain was similar in the two groups. These data support this form of combined therapy as one option for treating obese persons with type 2 diabetes no longer responsive to oral therapy alone. OBJECTIVE: This study tested a simple algorithm for beginning insulin for obese patients with type 2 diabetes after sulfonylurea failure, comparing suppertime 70/30 insulin plus continued glimepiride with insulin alone. RESEARCH DESIGN AND METHODS: This was a multicenter ambulatory randomized double-masked parallel comparison. There were 208 subjects with secondary failure to sulfonylureas who took glimepiride titrated to 8 mg b.i.d.
for 8 weeks; 145 subjects with fasting plasma glucose (FPG) 180–300 mg/dl (10–16.7 mmol/l) on this treatment were randomized to placebo plus insulin (PI) or glimepiride plus insulin (GI) for 24 weeks. A dosage of 70/30 insulin before supper was titrated, seeking fasting capillary blood glucose (FBG) 120 mg/dl (6.7 mmol/l), equivalent to FPG 140 mg/dl (7.8 mmol/l). Outcome measures included FPG, HbA1c, insulin dosage, weight, serum insulin and lipids, and adverse events. RESULTS: FPG and HbA1c were equivalent at baseline: 261 vs. 250 mg/dl (14.5 vs. 13.9 mmol/l), and 9.9 vs. 9.7%. At 24 weeks, the FPG target was achieved in both groups (136 vs. 138 mg/dl, 7.6 vs. 7.6 mmol/l), and HbA1c values were equal (7.7 vs. 7.6%). However, with GI, control improved faster, fewer subjects dropped out (3 vs. 15%, P < 0.01), and less insulin was needed (49 vs. 78 U/d, P < 0.001). The outcomes were alike in other respects. No subject had severe hypoglycemia. CONCLUSIONS: Injection of 70/30 insulin before supper safely restored glycemic control of type 2 diabetes not controlled by glimepiride alone. Control was restored more rapidly and with less injected insulin when glimepiride was continued. In 20 patients with non-insulin-dependent diabetes mellitus (NIDDM) and secondary failure to sulfonylurea, a double-blind randomized study was performed comparing two regimens: insulin plus placebo (IP) and insulin plus glyburide (IG). The protocol included two hospitalization periods (days 1–18 and 78–85) and follow-up at the outpatient clinic for 325 days. The metabolic control was kept as tight as possible. The subjects underwent normoglycemic clamp studies and meal tests with determination of insulin, C-peptide, glucagon, somatostatin, and gastric inhibitory polypeptide in plasma.
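The glucose values in these trials are quoted in both conventional (mg/dl) and SI (mmol/l) units, e.g. the FBG target of 120 mg/dl (6.7 mmol/l) and FPG 140 mg/dl (7.8 mmol/l) above. The two scales are related by the molar mass of glucose (about 180.16 g/mol, a standard chemistry fact, not something stated in these abstracts); a minimal Python sketch of the conversion:

```python
# Conversion between conventional and SI glucose units.
# 1 mmol/L of glucose = 18.016 mg/dL (molar mass ~180.16 g/mol, divided by 10
# because dL -> L). Factor is standard; function names are illustrative.
GLUCOSE_MG_PER_MMOL = 18.016

def mgdl_to_mmoll(mgdl: float) -> float:
    """Convert a glucose concentration from mg/dL to mmol/L."""
    return mgdl / GLUCOSE_MG_PER_MMOL

def mmoll_to_mgdl(mmoll: float) -> float:
    """Convert a glucose concentration from mmol/L to mg/dL."""
    return mmoll * GLUCOSE_MG_PER_MMOL

# The targets quoted in the trial above:
# 120 mg/dl -> ~6.7 mmol/l, 140 mg/dl -> ~7.8 mmol/l
```

This reproduces the paired values reported throughout the section (e.g. 180–300 mg/dl as 10–16.7 mmol/l).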
On IG, they demonstrated marked and long-lasting improvement of metabolic control: HbA1c decreased from 11.1 ± 0.3% on day 3 to 8.3 ± 0.4% (P < .001) on day 78 and 9.1 ± 0.5% (P < .001) on day 325. In subjects on IP, the corresponding values were 10.3 ± 0.5, 8.4 ± 0.4 (P < .001), and 8.9 ± 0.5% (P < .05). Body weight increased by 6.0 ± 1.5 kg (P < .005) on IG and 2.9 ± 2.1 kg (NS) on IP. The daily insulin requirement decreased on IG from 62.5 ± 12.9 U/day on day 7 to 33.5 ± 8.8 U/day on day 83 and 34.6 ± 8.9 U/day on day 325. On IP the insulin requirement was almost constant: 62.0 ± 10.7 U/day on day 7, 55.5 ± 7.7 U/day on day 83, and 54.7 ± 7.9 U/day on day 325. Insulin sensitivity measured with the hyperinsulinemic clamp (plasma insulin ≈130 μU/ml) was similar on IP and IG at the initiation of the study and was unchanged on days 18 and 85. A key observation of this study, although the mechanism is unclear, is that the isoglycemic meal-related insulin requirement was diminished by insulin treatment, indicating improvement of meal-related insulin sensitivity. Glyburide increased basal and meal-stimulated but not glucagon-stimulated insulin and C-peptide levels, and also augmented the effect of meals on somatostatin release. We conclude that in NIDDM, the IG regimen promptly and continuously decreased insulin requirement and improved metabolic control. This effect is, at least during the first 3 mo, mainly due to enhanced insulin secretion. IG and IP treatment had no effect on insulin sensitivity during the hyperinsulinemic-normoglycemic clamp, whereas meal-related insulin sensitivity was augmented. BACKGROUND: Activation of free fatty acid receptor 1 (FFAR1; also known as G-protein-coupled receptor 40) by fatty acids stimulated glucose-dependent β-cell insulin secretion in preclinical models.
We aimed to assess whether selective pharmacological activation of this receptor by TAK-875 in patients with type 2 diabetes mellitus improved glycaemic control without hypoglycaemia risk. METHODS: We undertook a phase 2, randomised, double-blind, placebo-controlled and active-comparator-controlled trial in outpatients with type 2 diabetes who had not responded to diet or metformin treatment. Patients were randomly assigned equally to receive placebo, TAK-875 (6.25, 25, 50, 100, or 200 mg), or glimepiride (4 mg) once daily for 12 weeks. Patients and investigators were masked to treatment assignment. The primary outcome was change in haemoglobin A1c (HbA1c) from baseline. Analysis included all patients randomly assigned to treatment groups who received at least one dose of double-blind study drug. The trial is registered at ClinicalTrials.gov, NCT01007097. FINDINGS: 426 patients were randomly assigned to TAK-875 (n=303), placebo (n=61), and glimepiride (n=62). At week 12, significant least-squares mean reductions in HbA1c from baseline occurred in all TAK-875 groups (ranging from -1.12% [SE 0.113] with 50 mg to -0.65% [SE 0.114] with 6.25 mg) and the glimepiride group (-1.05% [SE 0.111]) versus placebo (-0.13% [SE 0.115]; p value range 0.001 to <0.0001). Treatment-emergent hypoglycaemic events were similar in the TAK-875 and placebo groups (2% [n=7, all TAK-875 groups] vs 3% [n=2]); significantly higher rates were reported in the glimepiride group (19% [n=12]; p value range 0.010–0.002 vs all TAK-875 groups). Incidence of treatment-emergent adverse events was similar in the TAK-875 groups overall (49%; n=147, all TAK-875 groups) and the placebo group (48%, n=29) and was lower than in the glimepiride group (61%, n=38). INTERPRETATION: TAK-875 significantly improved glycaemic control in patients with type 2 diabetes with minimum risk of hypoglycaemia.
The results show that activation of FFAR1 is a viable therapeutic target for treatment of type 2 diabetes. FUNDING: Takeda Global Research and Development. OBJECTIVE: To determine whether the combination of troglitazone (a peroxisome proliferator-activated receptor-γ activator) and sulfonylurea will provide efficacy not attainable by either medication alone. RESEARCH DESIGN AND METHODS: There were 552 patients inadequately controlled on maximum doses of sulfonylurea who participated in a 52-week randomized active-controlled multicenter study. Patients were randomized to micronized glyburide 12 mg q.d. (G12); troglitazone monotherapy 200, 400, or 600 mg q.d. (T200, T400, T600); or combined troglitazone and glyburide q.d. (T200/G12, T400/G12, T600/G12). Efficacy measures included HbA1c, fasting serum glucose (FSG), insulin, and C-peptide. Effects on lipids and safety were also assessed. RESULTS: Patients on T600/G12 had significantly lower mean (± SEM) FSG (9.3 ± 0.4 mmol/l; 167.4 ± 6.6 mg/dl) compared with control subjects (13.7 ± 0.4 mmol/l; 246.5 ± 6.8 mg/dl; P < 0.0001) and significantly lower mean HbA1c (7.79 ± 0.2 vs. 10.58 ± 0.18%, P < 0.0001). Significant dose-related decreases were also seen with T200/G12 and T400/G12. Among patients on T600/G12, 60% achieved HbA1c ≤8%, 42% achieved HbA1c ≤7%, and 40% achieved FSG ≤7.8 mmol/l (140 mg/dl). Fasting insulin and C-peptide decreased with all treatments. Overall, triglycerides and free fatty acids decreased, whereas HDL cholesterol increased. LDL cholesterol increased slightly, with no change in apolipoprotein B. Adverse events were similar across treatments.
Hypoglycemia occurred in 3% of T600/G12 patients compared with <1% on G12 or troglitazone monotherapy. CONCLUSIONS: Patients with type 2 diabetes inadequately controlled on sulfonylurea can be effectively managed with a combination of troglitazone and sulfonylurea that is safe, well tolerated, and represents a new approach to achieving the glycemic targets recommended by the American Diabetes Association. OBJECTIVE: To determine whether insulin-requiring patients with non-insulin-dependent diabetes mellitus (NIDDM) and good glycemic control would benefit in weight control, serum lipid concentrations, or blood pressure from a reduction in exogenous insulin treatment. METHODS: Eighteen patients with well-controlled NIDDM who required insulin therapy were entered into a randomized, placebo-controlled, double-blind, crossover study of the addition, for 12 weeks, of treatment with a second-generation sulfonylurea agent (micronized glyburide). RESULTS: The mean fasting plasma glucose at entry was 7.00 ± 0.22 mmol/L and at the end of the 12-week treatment phase was 7.67 ± 0.39 mmol/L with placebo and 7.28 ± 0.44 mmol/L with active drug. Hemoglobin A1c was unchanged during the study (7.5 ± 0.2% at entry, 7.5 ± 0.3% with placebo, and 7.4 ± 0.3% with active drug). Addition of the orally administered agent resulted in a 29% decrease in exogenous insulin requirements and a 37% increase in 24-hour urinary C-peptide excretion. Patients had no change in weight after 12 weeks of either placebo or active drug. Plasma cholesterol levels declined slightly during the study, but they did not differ significantly during drug and placebo treatment. Blood pressure was unchanged in the subjects both with and without hypertension. CONCLUSION: In patients with NIDDM and good glycemic control with insulin treatment, a glyburide-related increase in endogenous insulin secretion caused a proportionate decrease in exogenous insulin requirements.
With continued good glycemic control, however, the orally administered agent showed no additional benefit on weight, blood pressure, plasma triglycerides, or low-density lipoprotein or high-density lipoprotein cholesterol. AIM: The effect on body composition of liraglutide, a once-daily human glucagon-like peptide-1 analogue, as monotherapy or added to metformin was examined in patients with type 2 diabetes (T2D). METHODS: These were randomized, double-blind, parallel-group trials of 26 weeks [Liraglutide Effect and Action in Diabetes-2 (LEAD-2)] and 52 weeks (LEAD-3). Patients with T2D, aged 18–80 years, body mass index (BMI) ≤40 kg/m2 (LEAD-2) or ≤45 kg/m2 (LEAD-3), and HbA1c 7.0–11.0% were included. Patients were randomized to liraglutide 1.8, 1.2 or 0.6 mg/day, placebo, or glimepiride 4 mg/day, all combined with metformin 1.5–2 g/day in LEAD-2, and to liraglutide 1.8, 1.2 or glimepiride 8 mg/day in LEAD-3. In LEAD-2 and LEAD-3, total lean body tissue, fat tissue and fat percentage were measured; in LEAD-2, adipose tissue area and hepatic steatosis were also assessed. RESULTS: In LEAD-2, fat percentage with liraglutide 1.2 and 1.8 mg/metformin was significantly reduced vs. glimepiride/metformin (p < 0.05) but not vs. placebo. Visceral and subcutaneous adipose tissue areas were reduced from baseline in all liraglutide/metformin arms. Except with liraglutide 0.6 mg/metformin, reductions were significantly different vs. changes seen with glimepiride (p < 0.05) but not with placebo. Liver-to-spleen attenuation ratio increased with liraglutide 1.8 mg/metformin, possibly indicating reduced hepatic steatosis. In LEAD-3, reductions in fat mass and fat percentage with liraglutide monotherapy were significantly different vs. increases with glimepiride (p < 0.01). CONCLUSION: Liraglutide (as monotherapy or added to metformin) significantly reduced fat mass and fat percentage vs.
glimepiride in patients with T2D. BACKGROUND: The efficacy of thiazolidinediones, as compared with other oral glucose-lowering medications, in maintaining long-term glycemic control in type 2 diabetes is not known. METHODS: We evaluated rosiglitazone, metformin, and glyburide as initial treatment for recently diagnosed type 2 diabetes in a double-blind, randomized, controlled clinical trial involving 4360 patients. The patients were treated for a median of 4.0 years. The primary outcome was the time to monotherapy failure, which was defined as a confirmed level of fasting plasma glucose of more than 180 mg per deciliter (10.0 mmol per liter), for rosiglitazone, as compared with metformin or glyburide. Prespecified secondary outcomes were levels of fasting plasma glucose and glycated hemoglobin, insulin sensitivity, and beta-cell function. RESULTS: Kaplan-Meier analysis showed a cumulative incidence of monotherapy failure at 5 years of 15% with rosiglitazone, 21% with metformin, and 34% with glyburide. This represents a risk reduction of 32% for rosiglitazone, as compared with metformin, and 63%, as compared with glyburide (P<0.001 for both comparisons). The difference in the durability of the treatment effect was greater between rosiglitazone and glyburide than between rosiglitazone and metformin. Glyburide was associated with a lower risk of cardiovascular events (including congestive heart failure) than was rosiglitazone (P<0.05), and the risk associated with metformin was similar to that with rosiglitazone. Rosiglitazone was associated with more weight gain and edema than either metformin or glyburide but with fewer gastrointestinal events than metformin and with less hypoglycemia than glyburide (P<0.001 for all comparisons).
CONCLUSIONS: The potential risks and benefits, the profile of adverse events, and the costs of these three drugs should all be considered to help inform the choice of pharmacotherapy for patients with type 2 diabetes. (ClinicalTrials.gov number, NCT00279045.) OBJECTIVE: To compare the therapeutic effects of the α-glucosidase inhibitor miglitol (BAY m 1099), the sulfonylurea glibenclamide, and placebo on parameters of metabolic control and safety in patients with NIDDM that is inadequately controlled by diet alone. RESEARCH DESIGN AND METHODS: After a 4-week placebo run-in period, 201 patients in 18 centers in 4 countries were randomized in a double-blind manner to miglitol (50 mg t.i.d., followed by 100 mg t.i.d.), glibenclamide (3.5 mg q.d./b.i.d.), or placebo for 24 weeks. Efficacy criteria were changes from baseline of HbA1c, fasting and postprandial blood glucose and insulin levels, body weight, and serum triglycerides. RESULTS: Efficacy was assessed in 119 patients who completed the full protocol, and the results were similar to those obtained in 186 patients who fulfilled the validity criteria for analysis. Compared with placebo, mean baseline-adjusted HbA1c decreased by 0.75% (P = 0.0021) and 1.01% (P = 0.0001) in the miglitol and glibenclamide treatment groups, respectively. Blood glucose decreased slightly in the fasting state and considerably in the postprandial state in both treatment groups but not in the placebo group. Fasting insulin levels increased slightly (NS) in all treatment groups; however, postprandial insulin levels decreased with miglitol, while increasing markedly with glibenclamide (P = 0.0001 between all treatment groups). Gastrointestinal side effects (flatulence and diarrhea) occurred mostly in the miglitol-treated patients, while some glibenclamide-treated patients had symptoms suggestive of hypoglycemia.
CONCLUSIONS: Miglitol monotherapy is effective and safe in NIDDM patients. Compared with glibenclamide, it reduced HbA1c less effectively and caused more gastrointestinal side effects. On the other hand, glibenclamide, unlike miglitol, tended to cause hypoglycemia, hyperinsulinemia, and weight gain, which are not desirable in patients with NIDDM. Combination therapy with insulin and sulphonylurea has gained acceptance in the management of subjects with Type 2 (non-insulin-dependent) diabetes mellitus. However, its role in the management of Type 1 (insulin-dependent) diabetes mellitus remains controversial. In this study, the effect of combination therapy with insulin and glibenclamide on metabolic control, daily insulin dosage, and insulin sensitivity was assessed in subjects with Type 1 diabetes mellitus. Ten men with Type 1 diabetes mellitus participated in a randomized, double-blind, crossover clinical trial with three treatment regimens, namely (1) insulin alone, (2) insulin and placebo, and (3) insulin and glibenclamide, each lasting 3 months. Compared with treatment with either insulin alone or insulin and placebo, combination therapy induced: (1) a reduction in daily insulin dosage; (2) more uniform blood glucose control, as reflected by a lower average 24 h blood glucose level, a smaller difference between mean preprandial and 2 h postprandial blood glucose concentrations, decreased 24 h urine glucose excretion, and a decline in the number of hypoglycaemic events; and (3) improved insulin sensitivity, as expressed by a more rapid plasma glucose disappearance rate, without a significant alteration in fasting plasma glucagon and 1 h postprandial serum C-peptide levels.
Therefore, it is apparent that the addition of glibenclamide to insulin reduces daily insulin dosage and renders a greater uniformity to diurnal blood glucose control, most probably secondary to enhancement of insulin sensitivity. Although insulin and sulfonylureas often have additive clinical effects when used in combination for type II (non-insulin-dependent) diabetes, these results are variable and a clinical role for this approach is not yet established. This study tests the efficacy of a specific combined regimen for a subpopulation of patients with a randomized double-masked placebo-controlled crossover design and under conditions similar to those of clinical practice. Twenty subjects with limited-duration (<15 yr) type II diabetes who were moderately obese (<160% ideal weight) and proved imperfectly controlled on 10 mg glyburide twice daily completed two 4-mo crossover protocols, comparing a single injection of NPH insulin in the evening plus 10 mg glyburide in the morning with insulin plus placebo. Insulin dose was adjusted by experienced endocrinologists seeking the best glycemic control consistent with safety. All subjects had glycosylated hemoglobin values ≤150% of the control mean on combined therapy, and combined therapy was superior to insulin alone (fasting plasma glucose 8.0 ± 0.3 vs. 11.1 ± 0.6 mM, P < .01; glycosylated hemoglobin 9.8 ± 0.1 vs. 10.6 ± 0.2%, P < .01). Despite greater weight gain on combined therapy, blood pressure and plasma lipid concentrations were the same on the two regimens.
These results suggest this simple regimen offers another option, besides multiple injections of insulin, for patients of this kind who are unsuccessful with a sulfonylurea or a single injection of insulin alone. This study investigated the long-term effect of insulin or the combination of insulin and an oral hypoglycemic compound (glipizide) on the skeletal muscle capillary basement membrane width in insulin-requiring diabetic patients. Seventy diabetic patients were randomized to treatment with either insulin-placebo or insulin-glipizide (5 mg/d) for 3 years. Of these, only 61 patients completed the study; 27 patients received insulin-placebo and 34 patients received insulin-glipizide. Three skeletal muscle (quadriceps femoris) biopsies were performed in all patients over the 3-year period. Glycosylated hemoglobin A1 was determined every 100 ± 20 days, along with plasma glucose levels. Muscle capillary basement membrane width was quantitated by a previously described method. After approximately 16 months, glycosylated hemoglobin A1 decreased significantly in each group from its baseline (P < 0.001 in the insulin-glipizide group and P < 0.025 in the insulin-placebo group), although no statistically significant difference was seen between the two groups. After 3 years this decrease was statistically significant (P < 0.001) only in the insulin-glipizide group. At baseline, no statistically significant difference was found in the muscle capillary basement membrane width between the two groups. In spite of the significant decrease in glycosylated hemoglobin A1 in both groups after 14 to 16 months, only the muscle capillary basement membrane width in the insulin-glipizide group decreased significantly compared with baseline. Patients receiving insulin-placebo showed a gradual increase in the muscle capillary basement membrane width, which after 3 years was significantly higher than baseline (P < 0.02).
Although the mechanisms by which the addition of glipizide to insulin treatment reduced the thickening of the muscle capillary basement membrane are not clearly understood, the current findings suggest that diabetic microangiopathy is not necessarily progressive and that prophylaxis may be attained. OBJECTIVE: To compare the long-term effect of combined treatment with insulin and glyburide versus insulin alone on serum lipid levels in non-insulin-dependent diabetic (NIDDM) patients with secondary failure to sulfonylurea therapy. RESEARCH DESIGN AND METHODS: The study was a randomized double-blind placebo-controlled parallel trial with a duration of 325 days, conducted at a referral-based endocrinology clinic. Subjects were a sequential sample of 20 patients with NIDDM who had failed to respond to glyburide treatment after at least 1 yr of adequate glucose control with this therapy. The patients were randomized to treatment with insulin and glyburide (IG) or insulin and placebo (IP). Insulin was given twice daily to all patients as a mixture of NPH and regular insulins in dosages aiming at optimal glucose control. Glyburide or placebo was taken before breakfast (7 mg) and dinner (3.5 mg). RESULTS: Mean HbA1c decreased from 11.1% (range 9.8–12.9%) before insulin to 9.1% (range 6.8–11.4%) on day 325 (P < 0.001) in IG patients and from 10.3% (range 8.4–13.3%) to 9.0% (range 6.3–11.8%) (P < 0.05) in IP patients. In both groups, there was an increase in high-density lipoprotein cholesterol of approximately 20% lasting throughout the study (P < 0.01). During the first 83 days of the study, there was a decrease in serum cholesterol (P < 0.01) and serum triglycerides (P < 0.05) in both groups.
All changes in lipid variables were comparable in magnitude and duration in both groups; treatment with insulin and glyburide in NIDDM patients with secondary sulfonylurea failure improves lipid metabolism to a similar degree as insulin therapy alone. "Field (clinical) trials are indispensable. They will continue to be an ordeal. They lack glamour, they strain our resources and our patience, and they protract the moment of truth to excruciating limits. Still, they are among the most challenging tests of our skills.... If, in major medical dilemmas, the alternative is to pay the cost of perpetual uncertainty, have we really any choice?" 1 The clinical trial of the effect of hypoglycemic agents on the vascular complications of diabetes mellitus conducted by the University Group Diabetes Program (UGDP) for the past ten years bears witness to the truthfulness of the above quotation. The hypothesis that control of the blood glucose level prevents or delays the vascular complications of diabetes has been supported so far only by evidence which is largely conjectural. 2 Such a relationship can be documented only with prospective study of patients randomly assigned to treatment regimens. AIM: The aim of our double-blind, placebo-controlled study was to compare the effect of acarbose and glibenclamide on insulin sensitivity in type 2 diabetes. METHODS: We investigated 77 patients (mean age 58.7 years, mean BMI 27.3 kg/m2), treated by diet alone for at least 4 weeks. The subjects were randomized into three treatment groups for 16 weeks: 100 mg t.i.d. acarbose (n = 25), 1 mg t.i.d. glibenclamide (n = 27), or placebo t.i.d. (n = 25). Before and after therapy, the levels of fasting plasma glucose, glycosylated haemoglobin, fasting insulin, and plasma glucose and insulin 1 h after a standardized breakfast were measured, and insulin sensitivity was determined by the euglycaemic hyperinsulinaemic clamp test.
RESULTS After the treatment period , BMI in the acarbose and placebo group decreased significantly , whereas in the glibenclamide group a significant increase was observed . Fasting plasma glucose was only significant reduced under glibenclamide . The postpr and ial glucose decreased significantly after acarbose ( 13.8 vs. 11.4 mmol/l , p < 0.05 ) and glibenclamide treatment ( 14.6 vs. 11.4 mmol/l , p < 0.05 ) and was unchanged under placebo ( 13.8 vs. 13.7 mmol/l ) . The fasting insulin levels remained unchanged in all three groups , whereas postpr and ial insulin values increased significantly under glibenclamide . Neither acarbose nor glibenclamide significantly changed insulin sensitivity [ acarbose : glucose disposal rate before treatment 2.3 mg/kg body weight/min/insulin , after treatment 3.2 ; glibenclamide 2.2 vs. 2.1 ; placebo 2.6 vs. 3.0 ] . CONCLUSIONS Our results show a more substantial improvement of glucose control under glibenclamide than under acarbose which , however , was not associated with an increase of insulin sensitivity This study examined the potential beneficial effects of the addition of a second-generation sulfonylurea to insulin therapy for poorly controlled type II diabetes . A r and omized , double-blind , crossover experimental design was utilized in 16 type II diabetic patients for a period of eight months . Treatment with glyburide , 20 mg/d ( plus insulin ) , compared with placebo ( plus insulin ) result ed in a significant reduction in mean basal glucose ( 232 + /- 12 vs 262 + /- 11 mg/dL [ 12.8 vs 14.4 mmol/L ] ) and hemoglobin A1C ( 10.2 % + /- 0.5 % vs 10.9 % + /- 03 % ) concentrations . Concomitant with this change , basal C-peptide and free insulin values increased with glyburide therapy , but this pharmacological agent did not alter the ability of the patient 's erythrocytes to bind insulin . 
We conclude that in type II diabetic subjects receiving more than 28 units of insulin per day, the addition of glyburide results in a marginal, but statistically significant, improvement in basal glucose concentration, but not in glucose tolerance as assessed by integrated glucose concentration. Whether this small improvement in glycemia is worth the additional cost of sulfonylureas or the risk of drug side effects is not known AIMS This study evaluated the effects on glycemic control of the addition of 2.5 mg glipizide GITS to metformin in patients with mild-to-moderate, but suboptimally controlled, type 2 diabetes. METHODS In this multicenter, double-blind, placebo-controlled study, 122 patients with type 2 diabetes inadequately controlled (A1c 7-8.5%) on metformin (≥1000 mg/day for ≥3 months) were randomized to 16 weeks of treatment with 2.5 mg/day glipizide GITS (n=61) or placebo (n=61), in addition to their current metformin dose. The primary efficacy variable was the change in A1c from baseline to endpoint. Changes in fasting plasma glucose (FPG), insulin concentrations, lipid profile, and safety variables were also measured. RESULTS The addition of glipizide GITS to metformin gave significantly greater improvements in mean A1c and FPG from baseline to endpoint than placebo addition (p<0.0002). Significantly more patients in the glipizide GITS group than in the placebo group achieved the target of A1c<7.0% (p<0.0001) and of A1c<6.5% (p<0.0033). Fasting insulin concentrations were similar in both groups and unchanged by treatment. Addition of glipizide GITS to metformin did not produce any significant or clinically relevant weight gain or changes in BMI. Both treatment regimens were well tolerated.
CONCLUSIONS This study showed that the addition of 2.5 mg glipizide GITS to metformin significantly improved glucose control in patients with type 2 diabetes inadequately controlled by metformin monotherapy Type 2 diabetes is characterized by increased acute phase serum proteins. We wanted to study how these proteins are related to complement activation in type 2 diabetes and how improvement of glycemic control affects them or complement activation. A total of 29 type 2 diabetic patients (age 55.2 ± 1.8 years, glycosylated hemoglobin [HbA(1c)] 8.9% ± 0.2%, body mass index [BMI] 30.9 ± 0.8 kg/m(2), duration 5.9 ± 1.3 years) participated in the study. They were previously treated either with diet alone or in combination with one oral antihyperglycemic medication. After a run-in period of at least 4 weeks on diet only, the patients were randomized to pioglitazone, glibenclamide, or placebo. Blood samples were taken before the treatments and at the end of the 6-month therapy. Basal C-reactive protein (CRP) level was related to acylation-stimulating protein (ASP) concentration (r = .55, P < .01), and many acute phase serum protein concentrations were associated with each other. The treatment reduced HbA(1c) level in the pioglitazone (from 9.1% ± 0.3% to 8.0% ± 0.5%, P < .05) and glibenclamide (from 8.9% ± 0.3% to 7.7% ± 0.2%, P < .05) groups. Glibenclamide treatment was associated with a reduction in alpha-1-antitrypsin (P < .05), ceruloplasmin (P < .01), and complement C3 protein (C3) (P < .05). Although ASP did not change significantly in any of the treatment subgroups, in the whole patient population the change in HbA(1c) during the treatments correlated positively with the change in ASP (r = .43, P < .05). The changes in many acute phase serum proteins and ASP were related to each other.
In conclusion, (1) inflammatory factors and complement activation are associated in patients with type 2 diabetes, and (2) changes in hyperglycemia are related to changes in the concentration of the complement activation product, ASP The effect of combined insulin-glibenclamide therapy on glucose control was evaluated in a double-blind, placebo-controlled study of 20 patients with non-insulin-dependent diabetes mellitus (NIDDM) and secondary failure to oral antidiabetic therapy with glibenclamide or glipizide. After an observation period of 1-3 months, insulin treatment was initiated, which resulted in rapid improvement of the glycemic control within 6 weeks. Thereafter glibenclamide or placebo was added to insulin for a further 12 weeks. Glibenclamide improved the glycemic control as expressed by a diminution of blood glucose and HbA1c. This was observed in spite of the fact that the daily insulin dose was reduced by approximately 30% in the glibenclamide-treated group of patients. It is concluded that in NIDDM patients with secondary failure to glibenclamide or glipizide therapy, the responsiveness to glibenclamide may be at least partially restored by a short period of insulin treatment. It is suggested that therapy with insulin and glibenclamide is an appropriate treatment regimen for NIDDM patients with secondary failure to sulfonylurea therapy
11,054
26,018,220
Both the type and structure of capacity-building strategies may have influenced effectiveness. The review also identified contextual factors that may require variations in the ways capacity-building interventions are designed.
Background Numerous agencies are providing training, technical assistance, and other support to build community-based practitioners’ capacity to adopt and implement evidence-based prevention interventions. Yet, little is known about how best to design capacity-building interventions to optimize their effectiveness. Wandersman et al. (Am J Community Psychol. 50:445–59, 2012) proposed the Evidence-Based System of Innovation Support (EBSIS) as a framework to guide research and thereby strengthen the evidence base for building practitioners’ capacity. The purpose of this review was to contribute to further development of the EBSIS by systematically reviewing empirical studies of capacity-building interventions to identify (1) the range of strategies used, (2) variations in the way they were structured, and (3) evidence for their effectiveness at increasing practitioners’ capacity to use evidence-based prevention interventions.
The goal of evidence-based medicine is ultimately to improve patient outcomes and quality of care. Systematic reviews of the available published evidence are required to identify interventions that lead to improvements in behavior, health, and well-being. Authoritative literature reviews depend on the quality of published research and research reports. The Consolidated Standards of Reporting Trials (CONSORT) Statement (www.consort-statement.org) was developed to improve the design and reporting of interventions involving randomized clinical trials (RCTs) in medical journals. We describe the 22 CONSORT guidelines and explain their application to behavioral medicine research and to evidence-based practice. Additional behavioral medicine-specific guidelines (e.g., treatment adherence) are also presented. Use of these guidelines by clinicians, educators, policymakers, and researchers who design, report, and evaluate or review RCTs will strengthen the research itself and accelerate efforts to apply behavioral medicine research to improve the processes and outcomes of behavioral medicine practice Background Tobacco remains a seemingly intractable problem for individuals living with severe and persistent mental illness. This study evaluated the implementation, technical assistance, and perceived impact of a model curriculum ("Learning About Healthy Living") to promote wellness and motivation to quit tobacco use in psychosocial rehabilitation clubhouses. Methods We used semi-structured interviews (n = 9) with clubhouse staff (n = 12) and a survey of participating clubhouse members (n = 271) in nine clubhouses. Results Fifty-eight percent of clubhouse participants completed surveys. Results showed tobacco users open to tobacco-free policies (62%) and perceiving more discussions about quitting tobacco with healthcare providers (69%).
Analyses of staff interviews and member surveys revealed four key themes: (1) the curriculum was successfully implemented and appreciated; (2) technical assistance kept implementation on track; (3) adding wellness content and interactive components should enhance the curriculum; and (4) the curriculum advanced other healthful policies and practices. Conclusions Mental health settings are important locations for implementing programs to address tobacco use. In this real-world implementation of a model curriculum in psychosocial rehabilitation clubhouses, the curriculum tested well, was feasible and well-received, and suggests potential impact on tobacco use outcomes. Revision, dissemination, and a randomized controlled trial evaluation of the model curriculum should now occur BACKGROUND Evidence-based public health decision-making depends on high quality and transparent accounts of what interventions are effective, for whom, how and at what cost. Improving the quality of reporting of randomized and non-randomized study designs through the CONSORT and TREND statements has had a marked impact on the quality of study designs. However, public health users of systematic reviews have been concerned with the paucity of synthesized information on context, development and rationale, implementation processes and sustainability factors. METHODS This paper examines the existing reporting frameworks for research against information sought by users of systematic reviews of public health interventions and suggests additional items that should be considered in future recommendations on the reporting of public health interventions. RESULTS Intervention model, theoretical and ethical considerations, study design choice, integrity of intervention/process evaluation, context, differential effects and inequalities, and sustainability are often overlooked in reports of public health interventions.
CONCLUSION Population health policy makers need synthesized, detailed and high quality a priori accounts of effective interventions in order to make better progress in tackling population morbidities and inequalities. Adding simple criteria to reporting standards will significantly improve the quality and usefulness of published evidence and increase its impact on public health program planning OBJECTIVES We examined the effect of community coalition network structure on the effectiveness of an intervention designed to accelerate the adoption of evidence-based substance abuse prevention programs. METHODS At baseline, 24 cities were matched and randomly assigned to 3 conditions (control, satellite TV training, and training plus technical assistance). We surveyed 415 community leaders at baseline and 406 at 18-month follow-up about their attitudes and practices toward substance abuse prevention programs. Network structure was measured by asking leaders whom in their coalition they turned to for advice about prevention programs. The outcome was a scale with 4 subscales: coalition function, planning, achievement of benchmarks, and progress in prevention activities. We used multiple linear regression and path analysis to test hypotheses. RESULTS Intervention had a significant effect on decreasing the density of coalition networks. The change in density subsequently increased adoption of evidence-based practices. CONCLUSIONS Optimal community network structures for the adoption of public health programs are unknown, but it should not be assumed that increasing network density or centralization are appropriate goals. Lower-density networks may be more efficient for organizing evidence-based prevention programs in communities This article describes process evaluation methods for the Pool Cool diffusion trial across 4 years.
Pool Cool is a skin cancer prevention program that was found to improve behaviors and environments for sun protection at swimming pools in a randomized efficacy trial, which was followed by a national diffusion trial. The process evaluation focus shifted from measuring program satisfaction to assessing widespread program implementation, barriers and facilitators to implementation, and program maintenance and sustainability. Data collection methods include training surveys, database tracking, field coordinator activity logs, e-mails, surveys of parents, lifeguards and pool managers, and process evaluation interviews and site visits. The data revealed high levels of implementation of major program components when disseminated in the diffusion trial, including sun safety lessons, sun safety signs, and sunscreen use. This article describes program features and participant factors that facilitated local implementation, maintenance and sustainability across dispersed pools, such as linkage agents, a packaged program, and adaptations of program elements Background Studies have shown that communities have not always been able to implement evidence-based prevention programs with quality and achieve outcomes demonstrated by prevention science. Implementation support interventions are needed to bridge this gap between science and practice. The purpose of this article is to present two-year outcomes from an evaluation of the Assets-Getting To Outcomes (AGTO) intervention in 12 Maine communities engaged in promoting Developmental Assets, a positive youth development approach to prevention. AGTO is an implementation support intervention that consists of: a manual of text and tools; face-to-face training; and onsite technical assistance, focused on activities shown to be associated with obtaining positive results across any prevention program. Methods This study uses a nested and cross-sectional, cluster randomized controlled design.
Participants were coalition members and program staff from 12 communities in Maine. Each coalition nominated up to five prevention programs to participate. At random, six coalitions and their respective 30 programs received the two-year AGTO intervention and the other six maintained routine operations. The study assessed prevention practitioner capacity (efficacy and behaviors), practitioner exposure to and use of AGTO, practitioner perceptions of AGTO, and prevention program performance. Capacity of coalition members and performance of their programs were compared between the two groups across the baseline, one-, and two-year time points. Results We found no significant differences between the AGTO and control groups' prevention capacity. However, within the AGTO group, significant differences were found between those with greater exposure to and use of AGTO. Programs that received the highest number of technical assistance hours showed the most program improvement. Conclusions This study is the first of its kind to show that use of an implementation support intervention, AGTO, yielded improvements in practitioner capacity and consequently in program performance on a large sample of practitioners and programs using a randomized controlled design. ClinicalTrials.gov Underage drinking is a significant problem facing US communities. Several environmental alcohol prevention (EAP) strategies (laws, regulations, responsible beverage service training and practices) successfully address underage drinking. Communities, however, face challenges carrying out these EAP strategies effectively. This small-scale, 3-year, randomized controlled trial assessed whether providing prevention coalitions with Getting To Outcomes-Underage Drinking (GTO-UD), a toolkit and implementation support intervention, helped improve implementation of two common EAP strategies, responsible beverage service training (RBS) and compliance checks.
Three coalitions in South Carolina and their RBS and compliance check programs received the 16-month GTO-UD intervention, including the GTO-UD manual, training, and onsite technical assistance, while another three in South Carolina maintained routine operations. The measures, collected at baseline and after the intervention, were a structured interview assessing how well coalitions carried out their work and a survey of merchant attitudes and practices in the six counties served by the participating coalitions. Over time, the quality of some RBS and compliance check activities improved more in GTO-UD coalitions than in the control sites. No changes in merchant practices or attitudes significantly differed between the GTO-UD and control groups, although merchants in the GTO-UD counties did significantly improve on refusing sales to minors while control merchants did not This paper describes the extent to which communities implementing the Communities That Care (CTC) prevention system adopt, replicate with fidelity, and sustain programs shown to be effective in reducing adolescent drug use, delinquency, and other problem behaviors. Data were collected from directors of community-based agencies and coalitions, school principals, service providers, and teachers, all of whom participated in a randomized, controlled evaluation of CTC in 24 communities. The results indicated significantly increased use and sustainability of tested, effective prevention programs in the 12 CTC intervention communities compared to the 12 control communities, during the active phase of the research project when training, technical assistance, and funding were provided to intervention sites, and 2 years following provision of such resources. At both time points, intervention communities also delivered prevention services to a significantly greater number of children and parents.
The quality of implementation was high in both conditions, with only one significant difference: CTC sites were significantly more likely than control sites to monitor the quality of implementation during the sustainability phase of the project One of the important research issues in the emerging area of research on dissemination of prevention programs relates to the type and extent of training needed by program providers to prepare them to implement effective programs with fidelity. The present paper describes the immediate outcomes of a dissemination and implementation trial of Project Toward No Drug Abuse, an evidence-based prevention program for high school students. A total of 65 high schools in 14 school districts across the USA were recruited and randomly assigned to one of three experimental conditions: comprehensive implementation support for teachers, regular workshop training only, or standard care control. The comprehensive intervention comprised on-site coaching, web-based support, and technical assistance, in addition to the regular workshop. Students (n = 2,983) completed self-report surveys before and immediately after program implementation. Fidelity of implementation was assessed with a classroom observation procedure that focused on program process. Results indicated that relative to the controls, both intervention conditions produced effects on hypothesized program mediators, including greater gains in program-related knowledge; greater reductions in cigarette, marijuana and hard drug use intentions; and more positive changes in drug-related beliefs. There were stronger effects on implementation fidelity in the comprehensive, relative to the regular, training condition. However, seven of the ten immediate student outcome measures showed no significant differences between the two training conditions.
The implications of these findings for dissemination research and practice are discussed There continues to be a gap in prevention outcomes achieved in research trials versus those achieved in "real-world" practice. This article reports interim findings from a randomized controlled trial evaluating Assets-Getting To Outcomes (AGTO), a two-year intervention designed to build prevention practitioners’ capacity to implement positive youth development-oriented practices in 12 community coalitions in Maine. A survey of coalition members was used to assess change in individual practitioners’ prevention capacity between baseline and one year later. Structured interviews with 32 program directors (16 in the intervention group and 16 in the control group) were used to assess changes in programs’ prevention practices during the same time period. Change in prevention capacity over time did not differ significantly between the intervention and control groups. However, in secondary analyses of only those assigned to the AGTO intervention, users showed greater improvement in their self-efficacy to conduct Assets-based programming and increases in the frequency with which they engaged in AGTO behaviors, whereas among non-users, self-efficacy to conduct Assets-based programming declined. Interview ratings showed improvement in several key areas of performance among intervention programs. Improvement was associated with the number of technical assistance hours received. These results suggest that, after one year, AGTO is beginning to improve the capacity of community practitioners who make use of it There has been relatively little research on effects of interventions aimed directly at improving internal community coalition functioning, particularly in the area of planning for adoption of evidence-based prevention programs.
The current study investigated the effect of Project STEP, a prevention diffusion trial, on three factors hypothesized to improve coalition prevention planning (quality of coalition plans, extent of plan implementation, and committee internal functioning in meetings). Cities were randomly assigned to one of three conditions (televised training with limited technical assistance, televised training alone, or control; n = 24). Results demonstrated that at 1.5-year follow-up, coalitions in the two intervention groups showed more effective prevention plans, plan implementation, and functioning in meetings than control coalitions. Group differences were maintained at 3-year follow-up, albeit at decreased levels, for quality of planning and implementation. The findings suggest that building coalition capacity to diffuse evidence-based prevention programs works at least partially by increasing the effectiveness of coalition functioning, and that booster training may be warranted within 3 years after initial training OBJECTIVES AIDS service organizations (ASOs) rarely have access to the information needed to implement research-based HIV prevention interventions for their clients. We compared the effectiveness of 3 dissemination strategies for transferring HIV prevention models from the research arena to community providers of HIV prevention services. METHODS Interviews were conducted with the directors of 74 ASOs to assess current HIV prevention services. ASOs were randomized to programs that provided (1) technical assistance manuals describing how to implement research-based HIV prevention interventions, (2) manuals plus a staff training workshop on how to conduct the implementation, or (3) manuals, the training workshop, and follow-up telephone consultation calls. Follow-up interviews determined whether the intervention model had been adopted.
RESULTS The dissemination package that provided ASOs with implementation manuals, staff training workshops, and follow-up consultation resulted in more frequent adoption and use of the research-based HIV prevention intervention for gay men, women, and other client populations. CONCLUSIONS Strategies are needed to quickly transfer research-based HIV prevention methods to community providers of HIV prevention services. Active collaboration between researchers and service agencies results in more successful program adoption than distribution of implementation packages alone There is a knowledge gap concerning how well community-based teams fare in implementing evidence-based interventions (EBIs) over many years, a gap that is important to fill because sustained high quality EBI implementation is essential to public health impact. The current study addresses this gap by evaluating data from PROSPER, a community-university intervention partnership model, in the context of a randomized-control trial of 28 communities. Specifically, it examines community teams’ sustainability of implementation quality on a range of measures, for both family-focused and school-based EBIs. Average adherence ratings approached 90% for family-focused and school-based EBIs, across as many as 6 implementation cohorts. Additional indicators of implementation quality similarly showed consistently positive results. Correlations of the implementation quality outcomes with a number of characteristics of community teams and intervention leaders were calculated to explore their potential relevance to sustained implementation quality. Though several relationships attained statistical significance at particular points in time, none were stable across cohorts.
The role of PROSPER’s continuous, proactive technical assistance in producing the positive results is discussed BACKGROUND In 2002, CDC recommended that the nation's schools establish policies that reduce sun exposure to decrease students' risk of skin cancer. PURPOSE A program to convince public school districts to adopt such a policy was evaluated. DESIGN RCT. SETTING/PARTICIPANTS Public school districts in Colorado (n=56) and Southern California (n=56). INTERVENTION Policy information, tools, and technical assistance were provided through printed materials, a website, meetings with administrators, and presentations to school boards. An RCT enrolled public school districts from 2005 to 2010. Policy adoption was promoted over 2 years at districts randomized to the intervention. MAIN OUTCOME MEASURES School board-approved policies were obtained from 106 districts and coded at baseline and 2-year follow-up. Analyses were conducted in 2010. RESULTS There was no difference in the percentage of districts adopting a policy (24% in intervention; 12% in control; p=0.142); however, intervention districts (adjusted M=3.10 of 21 total score) adopted stronger sun safety policies than control districts (adjusted M=1.79; p=0.035). Policy categories improved on sun safety education for students (intervention adjusted M=0.76; control adjusted M=0.43, p=0.048); provision of outdoor shade (intervention adjusted M=0.79; control adjusted M=0.28, p=0.029); and outreach to parents (intervention adjusted M=0.59; control adjusted M=0.20, p=0.027). CONCLUSIONS Multifaceted promotion can increase adoption of stronger policies for reducing sun exposure of students by public school districts. Future research should explore how policies are implemented by schools BACKGROUND The Guide to Community Preventive Services (Community Guide) offers evidence-based intervention strategies to prevent chronic disease.
The American Cancer Society (ACS) and the University of Washington Health Promotion Research Center co-developed ACS Workplace Solutions (WPS) to improve workplaces' implementation of Community Guide strategies. PURPOSE To test the effectiveness of WPS for midsized employers in low-wage industries. DESIGN Two-arm RCT; workplaces were randomized to receive WPS during the study (intervention group) or at the end of the study (delayed control group). SETTING/PARTICIPANTS Forty-eight midsized employers (100-999 workers) in King County WA. INTERVENTION WPS provides employers one-on-one consulting with an ACS interventionist via three meetings at the workplace. The interventionist recommends best practices to adopt based on the workplace's current practices, provides implementation toolkits for the best practices the employer chooses to adopt, conducts a follow-up visit at 6 months, and provides technical assistance. MAIN OUTCOME MEASURES Employers' implementation of 16 best practices (in the categories of insurance benefits, health-related policies, programs, tracking, and health communications) at baseline (June 2007-June 2008) and 15-month follow-up (October 2008-December 2009). Data were analyzed in 2010-2011. RESULTS Intervention employers demonstrated greater improvement from baseline than control employers in two of the five best-practice categories: implementing policies (baseline scores: 39% program, 43% control; follow-up scores: 49% program, 45% control; p=0.013) and communications (baseline scores: 42% program, 44% control; follow-up scores: 76% program, 55% control; p=0.007). Total best-practice implementation improvement did not differ between study groups (baseline scores: 32% intervention, 37% control; follow-up scores: 39% intervention, 42% control; p=0.328).
CONCLUSIONS WPS improved employers' health-related policies and communications but did not improve insurance benefits design, programs, or tracking. Many employers were unable to modify insurance benefits and reported that the time and costs of implementing best practices were major barriers. TRIAL REGISTRATION This study is registered at clinicaltrials.gov NCT00452816 Skin cancer is highly preventable, but clearly there is a critical need to focus on better ways to disseminate information about known skin cancer prevention. The U.S. Environmental Protection Agency's (EPA) SunWise Program is one channel for reaching children, teachers, and school nurses. In a pilot study designed to increase adoption of school-based sun protection policies, 28 schools were randomly assigned to one of three groups: Control, which included the EPA's original SunWise curriculum toolkit; SunWise Policy, which included a revised toolkit emphasizing policy; and SunWise Policy plus Technical Assistance, which included the policy toolkit and 3 technical assistance phone calls. The enhanced SunWise Policy plus Technical Assistance intervention led to more new sun protection policies. Use of study interventions for improving sun protection practices, such as policy toolkits or brief counseling, can be easily interwoven into school hours by school nurses and other health educators
11,055
12,144,591
The main efficacy outcome reported in the studies was the presence of cocaine metabolites in the urine. No significant results were found, regardless of the type of drug or dose used, for all relevant outcomes assessed. There is no current evidence supporting the clinical use of CBZ, antidepressants, dopamine agonists, disulfiram, mazindol, phenytoin, nimodipine, lithium and NeuRecover-SA in the treatment of cocaine dependence.
AIMS Cocaine dependence is a common and serious condition, associated with severe medical, psychological and social problems, including the spread of infectious diseases. This systematic review critically assesses the efficacy of pharmacotherapy for treating cocaine dependence.
We performed a double-blind, placebo-controlled, randomized 12-week trial of desipramine hydrochloride treatment of cocaine dependence among methadone-maintained patients. Fifty-nine patients completed the 12-week medication trial (36 received desipramine and 23 received placebo), and 94% were recontacted 1, 3, and 6 months after treatment. There were significantly more dropouts in the desipramine than in the placebo group. Baseline to 12-week comparisons of Addiction Severity Index interview data indicated that both groups showed improvements. At 12 weeks, the desipramine group showed significantly better psychiatric status than the placebo group but did not differ from the placebo group on any of 21 other outcome measures, including cocaine use. During the 12-week medication phase and at the 1-month follow-up evaluation, urine toxicology screenings showed no significant difference between groups, but the placebo group had significantly less cocaine use at both the 3- and 6-month follow-up points. We conclude that desipramine has few benefits with regard to control of cocaine use in this population A 4-week, double-blind, placebo-controlled trial of amantadine was conducted in 61 cocaine-dependent outpatients. Subjects received 100 mg of amantadine 3 times daily. A follow-up visit was conducted at week 8. There were no significant differences between groups in treatment retention, or in the number of benzoylecgonine-positive urine samples. Self-reported drug and alcohol use declined in both groups. At week 8 follow-up, self-reported drug use was significantly lower in the placebo group. Amantadine was not effective, and discontinuation of it may have been associated with an increase in cocaine use An open field trial was conducted comparing desipramine and an active placebo in separate populations of chronic cocaine and phencyclidine (PCP) abusers, who discontinued their abuse.
Subjects who received desipramine showed a decrease in depressive symptoms after a 20-40 day period regardless of whether they abused PCP or cocaine.

Summary: Recent preclinical studies suggest utility for voltage-sensitive calcium channel blockers (VSCCBs) in the treatment of cocaine addiction. The following double-blind placebo-controlled study examined the role of the VSCCB nimodipine in attenuating cocaine craving in 66 recently abstinent cocaine-dependent patients on an inpatient substance abuse treatment unit utilizing an intensive 12-step milieu-oriented psychosocial therapy. While the medication was well tolerated, the dose of nimodipine used in this study (90 mg q.d.) was not superior to placebo in reducing background or cue-induced cocaine craving over the 3 weeks of the study. There was the suggestion that nimodipine might attenuate the severity of some cocaine-induced brain deficits, as detected by evaluation of smooth pursuit eye movement function. A rationale for evaluating higher doses of nimodipine for the treatment of cocaine addiction is presented. As nimodipine might have anticraving and mood-stabilizing properties and cardio- and neuroprotective properties in the face of cocaine intoxication and might possibly even reverse some cocaine-induced brain deficits, further investigation of the role of nimodipine (and other VSCCBs) in cocaine addiction appears an attractive avenue of future medication development.

Three methods of analysis were used to determine the effects of the combination of counseling with fluoxetine (20, 40, or 60 mg) and "active" placebo (diphenhydramine, 12.5 mg) randomly assigned. Forty-five cocaine-only dependent subjects were treated as outpatients with "interpersonal" counseling, medication, and drug use monitoring three times per week for up to 12 weeks.
Treatment effects were analyzed: first, by comparing the three original assignments and placebo; second, by comparing the placebo group to fluoxetine subjects with detectable fluoxetine/norfluoxetine blood levels and those with no detectable medication blood level; third, by examining relapse prevention versus use cessation through stratifying the subjects into four groups according to fluoxetine or placebo assignment and initial urine cocaine positivity or negativity. All three analyses showed improvement on some measures over time regardless of group assignment. The 60-mg fluoxetine group showed the least effectiveness, the group with detectable blood levels had less craving, and the fluoxetine subjects who were abstinent at the start of treatment were somewhat less likely to avoid relapse than those on placebo.

BACKGROUND We examined the effects of disulfiram versus placebo on cocaine dependence in buprenorphine-maintained subjects. METHODS Opioid- and cocaine-dependent subjects (n = 20) were induced onto buprenorphine maintenance, then randomized to disulfiram (250 mg q.d.; n = 11) or placebo (n = 9) treatment for 12 weeks. RESULTS Groups were comparable at baseline on demographic measures and on baseline measures of drug-use severity. Fifteen subjects completed the study, including 8 subjects randomized to disulfiram (72.7%) and 7 subjects randomized to placebo (77.8%). The total number of weeks abstinent from cocaine was significantly greater on disulfiram versus placebo (mean ± SD: 7.8 ± 2.6 vs. 3.3 ± 0.5, p < .05) and the number of days to achieving 3 weeks (24.6 ± 15.1 vs. 57.8 ± 7.7, p < .01) of continuous cocaine abstinence was significantly lower on disulfiram compared with placebo.
The number of cocaine-negative urine tests during the trial was also higher on disulfiram (14.7) than on placebo (8.6); furthermore, subjects in the disulfiram group achieved consistently higher rates of cocaine-negative urine tests in each 3-week interval, and the increase over time was faster on disulfiram compared with placebo. CONCLUSIONS This preliminary study suggests the potential efficacy of disulfiram versus placebo for treatment of cocaine dependence in buprenorphine-maintained patients.

Thirty-six male cocaine abusers, in withdrawal, were studied for 99 days in a double-blind design. Treatment with bromocriptine was significantly more effective than placebo in alleviating withdrawal symptoms. Adding desipramine to the bromocriptine regimen was significantly more effective than either placebo or bromocriptine alone. The authors hypothesize that these results support a model of dopamine depletion and receptor supersensitivity in cocaine withdrawal.

A partial blockade of the multiple actions of cocaine is one strategy by which cocaine dependence may be treated. Risperidone, a 5-hydroxytryptamine and dopamine D2 antagonist, is an atypical antipsychotic and was a candidate medication for the treatment of cocaine dependence. One hundred ninety-three cocaine-dependent subjects were enrolled in a 12-week, randomized, double-blind, placebo-controlled trial. Subjects initially received either placebo or 4 or 8 mg of risperidone, with a subsequent change to active doses of 2 mg and 4 mg. Subjects attended the clinic twice each week, provided urine samples, obtained medication, and underwent one behavioral therapy session per week. The study was terminated at the interim analysis. Retention was worse for the 4- and 8-mg active medication groups. Side effects were primarily associated with the 8-mg dose, although neither 2 mg nor 4 mg was well accepted by subjects.
There was no reduction in cocaine use associated with risperidone. The results suggest that although antagonists might be a useful treatment approach, such as in the treatment of opiate dependence, risperidone is unlikely to find broad acceptance with the treatment-seeking population.

We conducted a multi-site, placebo-controlled, randomized double-blind clinical trial comparing bupropion HCl (300 mg/day) to placebo for the treatment of cocaine dependence in methadone-maintained subjects. A total of 149 subjects at three sites participated in a 12-week study. Outcome measures included cocaine use, level of depression, and psychosocial functioning. Results showed no significant differences between placebo and bupropion. Exploratory analyses suggested a medication effect for the subset of subjects depressed at study entry. The need to target subgroups of cocaine abusers in future pharmacotherapy trials and the possible role of treatment readiness are discussed.

This study was conducted to determine the effectiveness of carbamazepine (CBZ) for treatment of cocaine dependence. Sixty-two (CBZ = 28, placebo = 34) cocaine-dependent (DSM-III-R criteria) volunteers consented to be treated for eight weeks with standardized outpatient individual counseling twice a week plus double-blind CBZ or inactive placebo. During the 8-week trial, both groups showed an increased number of urine samples negative for cocaine, significantly (P < 0.01) decreased self-reported cocaine use (money spent and grams used), and decreased Beck Depression Inventory and Symptom Checklist-90-Revised (SCL-90-R) total scores. However, there were no significant differences between CBZ and placebo. This study does not support the effectiveness of CBZ for outpatient treatment of cocaine dependence.

Twenty-four cocaine addicts who experienced withdrawal symptoms were studied for six weeks in a double-blind design.
Half of the group received daily treatment with bromocriptine and the other half with placebo. Significant relief with bromocriptine was seen almost immediately and continued throughout the detoxification period. The authors speculate that the results are consistent with the "dopamine-depletion model" of cocaine withdrawal.

An interim analysis of 41 evaluable patients compared gepirone to placebo treatment in a randomized, double-blind, 12-week study of cocaine dependence without opiate abuse. The response to gepirone at a mean dose of 16.25 mg/day did not differ from placebo by measures of time in study, positive urine cocaine screens (greater than 6 weeks), Clinical Global Impressions (CGI) Global Improvement Scale, Cocaine Craving Scale (CCS), Quantitative Cocaine Inventory (QCI), Addiction Severity Index (ASI), Global Assessment Scale (GAS), Hamilton Rating Scale for Depression (HAM-D), and Hamilton Anxiety Scale (HAM-A). Both treatment groups showed similar modest, average improvements during the study in all treatment measures. Adverse events were not treatment limiting. The following demographic and study measures suggested favorable trends for study outcomes: older age, divorced status, higher pre-treatment cocaine use, lower CCS scores, and lower self-reports of cocaine use according to QCI.

Twenty male combinative cocaine free-base/phencyclidine (space-base) abusers were studied for forty-five days in a double-blind design. Treatment with desipramine was significantly more effective than placebo in alleviating abstinence symptoms. This study tends to support the catecholamine-depletion hypothesis of cocaine and phencyclidine withdrawal.

A 12-week placebo-controlled, randomized clinical trial was undertaken to evaluate imipramine as a treatment for cocaine abuse, and to examine whether its effect may be limited to subgroups defined by route of use or by diagnosis of depression.
One hundred thirteen patients were randomized, stratified by route of use and depression. All patients received weekly individual counseling. Compared to placebo, the imipramine group showed greater reductions in cocaine craving, cocaine euphoria, and depression, but the effect of imipramine on cocaine use was less clear. A favorable response, defined as at least 3 consecutive, urine-confirmed, cocaine-free weeks, was achieved by 19% (11/59) of patients on imipramine compared to 7% (4/54) on placebo (P < 0.09). The imipramine effect was greater among nasal users: 33% (9/27) response on imipramine vs. 5% (1/22) on placebo (P < 0.02). Response was also more frequent, but not significantly so, among depressed users on imipramine (26%, 10/38) than on placebo (13%, 4/31) (P < 0.19). Response rates were low in intravenous and freebase users and those without depression. Considered together with the literature on desipramine, these data suggest tricyclic antidepressants are not promising as a mainstay of treatment for unselected cocaine abusers. However, tricyclics may be useful for selected cocaine abusers with comorbid depression or intranasal use, or in conjunction with a more potent psychosocial intervention.

We conducted a single-blind, random assignment, placebo-controlled, 12-week comparison of desipramine hydrochloride and amantadine hydrochloride as adjunctive treatments to counseling for cocaine dependence. Subjects were 54 outpatients who met DSM-III-R criteria for active cocaine dependence and who completed a minimum of 2 weeks of treatment. Subjects treated with fixed doses of 200 mg/day desipramine (N = 17), 400 mg/day amantadine-placebo (N = 16), and placebo (N = 21) did not differ for lifetime cocaine use, lifetime histories of psychopathology, admission scores on psychometric assessments, and sociodemographics.
All treatment groups demonstrated dramatic and persistent decreases in cocaine use, craving for cocaine, and psychiatric symptoms consequent to treatment. Although there was a trend for more dropouts by subjects taking desipramine, there were no significant differences among treatment groups regarding retention in treatment, craving for cocaine, and decreased cocaine use confirmed by urine toxicology. There was a trend for subjects treated with desipramine to maintain longer periods of cocaine abstinence. The mean plasma concentration of desipramine in a subsample of our subjects was less than that recommended for treatment of depression; thus the dosage of desipramine may have been subtherapeutic.

We tested the efficacy of amantadine to reduce cocaine use or craving in cocaine-dependent methadone-maintained patients. Two doses of amantadine (200 mg p.o. daily, n = 16, and 200 mg p.o. bid, n = 21) were tested against placebo (n = 22) in a random assignment, double-blind clinical trial lasting nine weeks. Amantadine was well tolerated. However, neither dose of amantadine was more effective than placebo in reducing cocaine use and craving.

In a double-blind, placebo-controlled 12-week randomized clinical trial, we compared amantadine hydrochloride (300 mg/d; n = 33), desipramine hydrochloride (150 mg/d; n = 30), and placebo (n = 31) in the treatment of cocaine-abusing methadone-maintained patients. Treatment retention and medication compliance were excellent, with more than 75% of the patients completing the full 12-week trial. Although reported cocaine abuse was significantly lower in the medicated groups compared with the placebo group at week 4, this difference became nonsignificant at week 8, and no difference was found in cocaine-free urine samples.
Future studies of amantadine and desipramine treatment in these patients should consider alternatives to methadone hydrochloride, such as buprenorphine hydrochloride, and the selection of more homogeneous patient subgroups, such as depressed cocaine abusers.

Cocaine-induced kindling, which has been hypothesized to underlie cocaine-induced craving, is reversed by carbamazepine treatment. Though preliminary studies showed carbamazepine to be useful for relapse prevention in cocaine-dependent subjects, more recent studies have failed to replicate those findings. We conducted a randomized, double-blind, placebo-controlled trial of carbamazepine 600 mg/day in 40 cocaine-dependent males. During active treatment there were no significant effects of carbamazepine on cocaine use, alcohol consumption, anxiety or depressive symptoms. At three months post-treatment, carbamazepine-treated subjects reported fewer drinking days. We conclude that carbamazepine at this dose level is probably not efficacious for treatment of cocaine dependence.

AIMS Cocaine use by patients on methadone maintenance treatment is a widespread problem and is associated with a poorer prognosis. Recent studies have evaluated disulfiram as a treatment for individuals with comorbid alcohol and cocaine abuse. We evaluated the efficacy of disulfiram for cocaine dependence, both with and without comorbid alcohol abuse, in a group of methadone-maintained opioid addicts. DESIGN Randomized double-blind, placebo-controlled trial. SETTING Urban methadone maintenance clinic. PARTICIPANTS Sixty-seven cocaine-dependent, methadone-maintained, opioid-dependent subjects (52% female; 51% Caucasian). INTERVENTION Study medication, either disulfiram or placebo, was placed directly in the methadone to ensure compliance for 12 weeks.
MEASUREMENTS Primary outcome measures included weekly assessments of the frequency and quantity of drug and alcohol use, weekly urine toxicology screens and breathalyzer readings. FINDINGS Disulfiram-treated subjects decreased the quantity and frequency of cocaine use significantly more than those treated with placebo. Alcohol use was minimal for all subjects regardless of the medication. CONCLUSIONS Disulfiram may be an effective pharmacotherapy for cocaine abuse among methadone-maintained opioid addicts, even in those individuals without comorbid alcohol abuse. Disulfiram inhibits dopamine beta-hydroxylase, resulting in an excess of dopamine and decreased synthesis of norepinephrine. Since cocaine is a potent catecholamine re-uptake inhibitor, disulfiram may blunt cocaine craving or alter the "high", resulting in a decreased desire to use cocaine.

BACKGROUND At present, there is no consensus regarding effective treatment for cocaine abuse or the most productive roles for the two major forms of treatment, pharmacotherapy and psychotherapy. We conducted the first randomized clinical trial evaluating psychotherapy and pharmacotherapy, alone and in combination, as treatment for ambulatory cocaine abusers. METHODS One hundred thirty-nine subjects were assigned to one of four conditions offered over a 12-week abstinence initiation trial: relapse prevention plus desipramine hydrochloride, clinical management plus desipramine, relapse prevention plus placebo, and clinical management plus placebo. All treatments were manual-guided, delivered by experienced therapists, and monitored to promote the integrity of both forms of treatment. RESULTS First, although all groups showed significant improvement, significant main effects for medication or psychotherapy, or their combination, were not found for treatment retention, reduction in cocaine use, or other outcomes at 12 weeks.
Second, baseline severity of cocaine use interacted differently with psychotherapy and pharmacotherapy: higher-severity patients had significantly better outcomes when treated with relapse prevention than with clinical management, while desipramine was associated with improved abstinence initiation among lower-severity subjects. Third, desipramine was significantly more effective than placebo in reducing cocaine use over 6, but not 12, weeks of treatment. Fourth, depressed subjects had greater reduction in cocaine use than nondepressed subjects and had a better response to relapse prevention than to clinical management. CONCLUSION These findings underscore the significance of heterogeneity among cocaine abusers and the need to develop specialized treatments for clinically distinct subgroups of cocaine abusers.

We conducted a pilot study (N = 22) comparing the efficacy of desipramine and amantadine for treatment of cocaine dependence in methadone maintenance clients. The study, which lasted 12 weeks, was double-blind, randomly assigned, and placebo-controlled. Subjects met DSM-III-R criteria for active cocaine dependence. All three groups' cocaine use, craving, and depressive symptoms declined significantly, but intergroup differences were not significant. Clients receiving desipramine were significantly more likely to remain in treatment and to be cocaine free at study completion. The results emphasize the importance of delivering comprehensive services to the cocaine user in methadone treatment. Further evaluations of these two medications as adjuncts in the treatment of cocaine dependence are needed.

This is a preliminary report of a double-blind comparison of desipramine or carbamazepine to placebo among subjects participating in an outpatient cocaine treatment program.
Sixty-five subjects were randomly assigned to one of the active drugs or placebo and followed until treatment completion or drop-out to determine if either drug enhanced retention in treatment and/or increased cocaine abstinence. There was no significant difference between carbamazepine or desipramine and placebo on either outcome measure in this preliminary analysis. While this is a preliminary report and does not take into account the heterogeneity of the patients in cocaine treatment, the results are consistent with those of other investigators and suggest that use of desipramine or carbamazepine may not offer any advantage in retaining cocaine-dependent patients in treatment.

A 12-week, randomized, double-blind, placebo-controlled, fixed-dose outpatient study of carbamazepine (400 mg and 800 mg) in the treatment of cocaine dependence was performed. Data were analyzed with respect to both treatment condition and carbamazepine serum levels. Outcome variables included subject retention, cocaine urinalysis, self-reported cocaine use, cocaine craving, patient and clinical global impressions, the Drug Impairment Rating Scale for Cocaine, and side effects. Compared with placebo, the 400 mg treatment condition exhibited a greater decrease in the rate of positive cocaine urinalyses and a reduction in intensity and duration of craving over the course of the study. Higher serum carbamazepine levels were associated with a lower rate of positive cocaine urinalyses, fewer days of self-reported cocaine use, briefer craving episodes, and greater subject interval retention.
The clinical and methodologic implications of these findings and of the study design are discussed.

To evaluate the effectiveness of phenytoin in the treatment of cocaine abuse.

Twenty-nine cocaine-dependent male veterans without other drug dependence completed a double-blind, controlled, randomly assigned study examining the efficacy of bromocriptine versus placebo in the management of cocaine abstinence symptomatology. Serum prolactin (PL) and growth hormone (GH) levels were obtained prior to and after the study was completed. Patients were seen daily and completed several self-report questionnaires, including the Symptom Checklist-90-Revised, the Beck Depression Inventory, and a Cocaine Craving Report. The patients were also asked to rate a variety of cocaine withdrawal symptoms. Overall, there did not appear to be any advantage to receiving bromocriptine versus placebo during the first 3 weeks following cocaine use cessation, with the possible exception of changes in activity and appetite level. The placebo group showed a statistically significant increase in activity level during the first week in treatment and a significant increase in appetite throughout the study. Patients in both groups showed significant improvement in the other areas assessed, with improvement appearing to progress according to length of treatment. Hyperprolactinemia or abnormal GH levels were not found in this patient sample as a group. Thirty-four of the original 63 patients dropped out of the study. Seventeen received bromocriptine, and 17 received placebo. There was no significant difference between drug groups in incidence of retaining patients in treatment. The high dropout rate may reflect the difficulty incurred in retaining cocaine-dependent patients in treatment.

We report on a double-blind, placebo-controlled study of carbamazepine (CBZ) treatment for cocaine dependence.
A previously reported uncontrolled study found CBZ to be a beneficial pharmacotherapy for cocaine dependence. Statistical analyses were performed on data from 82 subjects who were randomized to 10 weeks' treatment with either CBZ, titrated to 4-12 micrograms/ml (n = 37), or placebo (n = 45). The two treatment groups did not differ on primary outcome measures of retention time in treatment, urine samples positive for cocaine metabolite, subject-reported desire for cocaine, or subject-reported side effects. CBZ was not an effective treatment in this study.

On the basis of the dopamine depletion theory, bromocriptine has been tested to treat cocaine withdrawal and dependence. The authors conducted a 6-week study with 1 week of pretreatment observation and 5 weeks of a randomized, double-blind, placebo-controlled clinical trial of bromocriptine for DSM-III-R-defined cocaine dependence in methadone-maintained male patients. The bromocriptine group (n = 24) did not differ from the placebo group (n = 26) in self-reported cocaine use, proportion of positive urine toxicology samples, craving for cocaine, resistance to cocaine use, or mood symptoms between the pretreatment baseline and the last week of the clinical trial. Both groups showed significant reduction in self-reported frequency of cocaine use, resistance to craving, and mood symptoms during participation in the protocol. The results of this study are consistent with recent clinical and laboratory findings in primary cocaine users. Despite initially promising pilot studies, recent evidence does not support the efficacy of bromocriptine to reduce cocaine use or craving.

This controlled study tested the efficacy of the selective serotonergic reuptake inhibitor fluoxetine in the outpatient treatment of primary crack cocaine dependence.
Thirty-two subjects were randomly assigned, 16 in each group, to placebo or fluoxetine, 40 mg/day, in a double-blind controlled trial over a 12-week period. Outcome measures included quantitative urine benzoylecgonine concentration, self-reports of cocaine use and craving, and treatment retention. Subjects assigned to fluoxetine were retained in treatment significantly longer than those on placebo: a median of 11 weeks compared to 3 weeks (logrank test, P < 0.001). Because of the poor retention in the placebo group, between-groups comparisons of outcome were limited to the first 6 weeks of treatment. No differences in cocaine use or craving were found between the two groups over weeks 1 to 6. The significant improvement in retention associated with fluoxetine may support further study of this medication in the treatment of cocaine dependence.

We conducted a double-blind, randomized clinical trial of mazindol (n = 37) for the prevention of relapse to cocaine abuse in methadone-maintained patients who were in the "action" stage of change, i.e., had a history of cocaine dependence but had been abstinent for at least 2 weeks prior to entry into the study. Eighty-one percent of subjects completed the 12-week course of treatment. Overall, cocaine use during the study was comparatively low: 17% of the urine screens submitted were positive for cocaine metabolite. Differences between the mazindol and placebo groups in rates of relapse, number of days to relapse, and cocaine use did not reach statistical significance, but were in the direction of a treatment effect. Results suggest that stage of abstinence initiation may be a potentially useful category to employ as an independent variable in future pharmacotherapy trials for the treatment of cocaine addiction in this patient population.

This double-blind placebo-controlled treatment study tested the efficacy of mazindol in currently cocaine-dependent outpatients.
Forty-three patients were randomized to mazindol (2 mg QD) vs. placebo treatment for 6 weeks. All patients received weekly group counseling. Patients improved with respect to objective (urine toxicology) and subjective (self-report of times used, dollars spent, craving, etc.) measures. There was no response difference between patients treated with mazindol and those who received placebo.

AIMS To evaluate disulfiram and three forms of manual-guided psychotherapy for individuals with cocaine dependence and concurrent alcohol abuse or dependence. DESIGN Randomized controlled trial. SETTING Urban substance abuse treatment center. PARTICIPANTS One hundred and twenty-two cocaine/alcohol abusers (27% female; 61% African-American or Hispanic). INTERVENTIONS One of five treatments delivered over 12 weeks: cognitive behavioral treatment (CBT) plus disulfiram; Twelve Step facilitation (TSF) plus disulfiram; clinical management (CM) plus disulfiram; CBT plus no medication; TSF plus no medication. MEASUREMENTS Duration of continuous abstinence from cocaine or alcohol; frequency and quantity of cocaine and alcohol use by week, verified by urine toxicology and breathalyzer screens. FINDINGS Disulfiram treatment was associated with significantly better retention in treatment, as well as longer duration of abstinence from alcohol and cocaine use. The two active psychotherapies (CBT and TSF) were associated with reduced cocaine use over time compared with supportive psychotherapy (CM). Cocaine and alcohol use were strongly related throughout treatment, particularly for subjects treated with disulfiram. CONCLUSIONS For the large proportion of cocaine-dependent individuals who also abuse alcohol, disulfiram combined with outpatient psychotherapy may be a promising treatment strategy.
This study underlines (a) the significance of alcohol use among treatment-seeking cocaine abusers, (b) the promise of the strategy of treating comorbid disorders among drug-dependent individuals, and (c) the importance of combining psychotherapy and pharmacotherapy in the treatment of drug use disorders.

Fifty cocaine-dependent patients completed a 2-week double-blind, double-dummy, parallel-group comparison of four dosage levels of diethylpropion and placebo. This clinical trial was designed to evaluate both diethylpropion's ability to attenuate cocaine cue-induced craving and its potential for development as a medication with cocaine-agonist properties. The results indicated that diethylpropion was not superior to placebo and confirmed earlier reports that craving for cocaine diminishes over the course of an inpatient hospitalization. Moreover, the results showed that the cocaine cue-induced craving paradigm employed is effective in stimulating craving for cocaine. Medications that are effective in attenuating this type of "conditioned" craving may have relevance to the breaking of the cycle of relapse and long-term treatment of cocaine dependence. Diethylpropion may not be an appropriate candidate for future medication development because of its lack of obvious therapeutic efficacy and the emergence of a significant number of side effects. However, a cocaine-agonist medication strategy may be appropriate for a subgroup of cocaine-dependent patients with coexisting attention deficit-hyperactivity disorder.

Based on previous reports that bromocriptine, a postsynaptic dopamine agonist, reduced cocaine craving and prevented relapse in cocaine-dependent subjects, effects of the drug were evaluated in 20 cocaine-dependent males in an inpatient drug rehabilitation programme. The subjective and physiologic effects of exposure to both cocaine-associated and neutral stimuli, presented using videotapes, were measured at one-week intervals.
Between laboratory sessions, subjects received either bromocriptine (1.25 mg bid) or a matched placebo, administered in double-blind fashion. Compared with the neutral videotape, the cocaine videotape elicited both a greater desire to use cocaine and more symptoms associated with cocaine self-administration. These results support an appetitive conditioning model of cocaine effects. Bromocriptine, however, had no effect on the cocaine-cue-associated reactivity, which declined over the 1-week interval in both treatment groups. Methodological differences among studies that have examined the effects of bromocriptine in cocaine-dependent subjects may explain the variable findings observed.

Results of preclinical studies suggest that pergolide, a mixed D(1)/D(2) dopamine receptor agonist, may be useful in treating cocaine dependence. To empirically investigate this possibility, we conducted a 5-year, double-blind, placebo-controlled clinical trial of two doses of pergolide (0.05 and 0.25 mg bid) in subjects with cocaine dependence and combined cocaine/alcohol dependence. Data analysis was performed on an intent-to-treat population (N = 358) and a per-protocol population (N = 108), with urine drug screens (UDS) used as the main outcome measure. There were no significant effects on UDS at either pergolide dose. Pergolide had no significant effect on alcohol use in the comorbid alcohol/cocaine dependence group. Pergolide does not appear to have clinical value in the treatment of cocaine dependence or in decreasing alcohol use in cocaine-dependent individuals at the presently studied doses.

The effectiveness of amantadine hydrochloride was evaluated in a double-blind placebo-controlled drug trial. The subjects were 42 cocaine-dependent men enrolled in a day hospital program. Twenty-one patients were prescribed 100 mg/bid of amantadine to be taken over 10.5 days and 21 were prescribed an equivalent amount of placebo.
The primary outcome measures were the Addiction Severity Index at 1 month after study entry and urines during the drug trial (end of weeks 1 and 2) and 1 month after study entry. Urines obtained at the end of the drug trial (2 weeks) indicated that the subjects receiving amantadine (93%) were more likely (P = 0.040) to be free of cocaine than the placebo (60%) subjects. Urine toxicology data at 1-month follow-up again indicated that more of the amantadine subjects (83%) were free of cocaine than the placebo (53%) subjects (P = 0.049), although no differences were found in self-reports of cocaine or other substance use in the past 30 days. The urine findings provided preliminary indication that amantadine may have some effectiveness in reducing cocaine use in cocaine-dependent patients. As part of a double-blind placebo-controlled study of the effects of ritanserin on cocaine use and craving, reactivity to cocaine-related events was assessed both before and during medication. Twenty-two patients receiving ritanserin and 23 receiving placebo were exposed to cocaine cues while continuous measures of heart rate, skin temperature, and skin resistance were taken. Self-reports of high, withdrawal, and craving were also collected. The cues produced significant physiological responding as well as significant increases in high and craving during both sessions. Ritanserin reduced cue-elicited decreases in skin temperature, but had no effect on heart rate and skin resistance or on cue-induced high and craving. The results demonstrate that cue reactivity is a robust phenomenon across two assessment sessions but fail to support the use of ritanserin as a means of reducing cue-elicited drug states. The objective of this research was to determine the efficacy of enhanced continuity of care and desipramine in increasing treatment attendance and abstinence from cocaine in primary cocaine abusers.
Study design was a random-assignment, placebo-controlled factorial with assessments at baseline and at 3 (first week of outpatient treatment), 8, and 12 weeks after start of study. Desipramine blood levels were taken at weeks 2 (inpatient), 3, and 8. Subjects (N = 94 men) were recruited on an inpatient ward and assigned to increased continuity of care or to standard treatment, and to active or placebo drug. Main outcome variables were toxicology-verified reports of cocaine use, and attendance at counseling sessions. Enhanced continuity of care increased abstinence from cocaine at week 3 and increased attendance at individual counseling sessions throughout the 12 weeks of the study. There were no main effects for desipramine. Blood levels above 123 ng/ml at week 2 predicted longer stays in outpatient treatment. We conclude that enhanced continuity of care is a low-cost intervention that improves early treatment outcome and attendance; desipramine effects do not warrant its therapeutic use. We conducted a double-blind, random-assignment, six-week comparison of desipramine hydrochloride (n = 24), lithium carbonate (n = 24), and placebo (n = 24) treatments for cocaine dependence. Subjects were 72 outpatient cocaine abusers who met DSM-III-R dependence criteria for cocaine but not for other substance abuse. Subjects in each treatment group were similar in history of cocaine and other substance abuse, cocaine craving, sociodemographics, and other psychiatric comorbidity. Desipramine, compared with both other treatments, substantially decreased cocaine use. Lithium treatment outcome did not differ from that of placebo. Desipramine-treated subjects attained contiguous periods of abstinence substantially more frequently than subjects receiving lithium or placebo.
Fifty-nine percent of the desipramine-treated subjects were abstinent for at least three to four consecutive weeks during the six-week study period, compared with 17% for placebo and 25% for lithium. Cocaine craving reductions were also substantially greater in the desipramine-treated subjects. Establishment of initial abstinence is the first stage in recovery from cocaine dependence. Our findings indicate that desipramine is an effective general treatment, for this first treatment stage, in actively cocaine-dependent outpatients. The effects of bromocriptine and amantadine in treating cocaine withdrawal were compared. Withdrawal symptoms are thought to be due to central dopamine depletion. Both bromocriptine and amantadine are dopamine agonists previously reported to diminish withdrawal symptoms. Thirty subjects were withdrawn for 30 days with amantadine, bromocriptine, or placebo. Bromocriptine and amantadine were more effective than placebo for 15 days. Amantadine's effectiveness then declined so that it was no more effective than placebo by experiment's end. Bromocriptine was significantly more effective than both throughout the latter phase of the study. Amantadine's decline in effectiveness is hypothesized to be due to stimulation of dopamine release.
11,056
28,446,499
A clear indication of nonlinearity was seen for the relations between vegetables, fruits, nuts, and dairy and all-cause mortality. Optimal consumption of risk-decreasing foods results in a 56% reduction of all-cause mortality, whereas consumption of risk-increasing foods is associated with a 2-fold increased risk of all-cause mortality. Conclusion: Selecting specific optimal intakes of the investigated food groups can lead to a considerable change in the risk of premature death.
Background: Suboptimal diet is one of the most important factors in preventing early death and disability worldwide. Objective: The aim of this meta-analysis was to synthesize the knowledge about the relation between intake of 12 major food groups, including whole grains, refined grains, vegetables, fruits, nuts, legumes, eggs, dairy, fish, red meat, processed meat, and sugar-sweetened beverages, with risk of all-cause mortality.
Background: It is estimated that disease burden due to low fruit and vegetable consumption is higher in Central and Eastern Europe (CEE) and the former Soviet Union (FSU) than any other part of the world. However, no large-scale studies have investigated the association between fruit and vegetable (F&V) intake and mortality in these regions yet. Design: The Health, Alcohol and Psychosocial Factors in Eastern Europe (HAPIEE) study is a prospective cohort study with participants recruited from the Czech Republic, Poland and Russia. Methods: Dietary data were collected using a food frequency questionnaire. Mortality data were ascertained through linkage with death registers. Multivariable-adjusted hazard ratios were calculated by Cox regression models. Results: Among 19,333 disease-free participants at baseline, 1314 died over the mean follow-up of 7.1 years. After multivariable adjustment, we found a statistically significant inverse association between cohort-specific quartiles of F&V intake and stroke mortality: the highest vs lowest quartile hazard ratio (HR) was 0.52 (95% confidence interval (CI): 0.28–0.98). For total mortality, a significant interaction (p = 0.008) between F&V intake and smoking was found. The associations were statistically significant in smokers, with HR 0.70 (0.53–0.91, p for trend: 0.011) for total mortality, and 0.62 (0.40–0.97, p for trend: 0.037) for cardiovascular disease (CVD) mortality. The association appeared to be mediated by blood pressure, and F&V intake explained a considerable proportion of the mortality differences between the Czech and Russian cohorts. Conclusions: Our results suggest that increasing F&V intake may reduce CVD mortality in CEE and FSU, particularly among smokers and hypertensive individuals. BACKGROUND: Red meat consumption has been associated with an increased risk of chronic diseases. However, its relationship with mortality remains uncertain.
METHODS: We prospectively observed 37,698 men from the Health Professionals Follow-up Study (1986-2008) and 83,644 women from the Nurses' Health Study (1980-2008) who were free of cardiovascular disease (CVD) and cancer at baseline. Diet was assessed by validated food frequency questionnaires and updated every 4 years. RESULTS: We documented 23,926 deaths (including 5910 CVD and 9464 cancer deaths) during 2.96 million person-years of follow-up. After multivariate adjustment for major lifestyle and dietary risk factors, the pooled hazard ratio (HR) (95% CI) of total mortality for a 1-serving-per-day increase was 1.13 (1.07-1.20) for unprocessed red meat and 1.20 (1.15-1.24) for processed red meat. The corresponding HRs (95% CIs) were 1.18 (1.13-1.23) and 1.21 (1.13-1.31) for CVD mortality and 1.10 (1.06-1.14) and 1.16 (1.09-1.23) for cancer mortality. We estimated that substitutions of 1 serving per day of other foods (including fish, poultry, nuts, legumes, low-fat dairy, and whole grains) for 1 serving per day of red meat were associated with a 7% to 19% lower mortality risk. We also estimated that 9.3% of deaths in men and 7.6% in women in these cohorts could be prevented at the end of follow-up if all the individuals consumed fewer than 0.5 servings per day (approximately 42 g/d) of red meat. CONCLUSIONS: Red meat consumption is associated with an increased risk of total, CVD, and cancer mortality. Substitution of other healthy protein sources for red meat is associated with a lower mortality risk. Despite a proposed protective effect of fish intake on the risk of cardiovascular disease, epidemiologic evidence on fish intake and mortality is inconsistent.
We investigated associations of fish intake, assessed through a validated food frequency questionnaire, with risks of total and cause-specific mortality in 2 prospective cohort studies of 134,296 Chinese men and women (1997-2009). Vital status and date and cause of death were ascertained through annual linkage to the Shanghai Vital Statistics Registry database and biennial home visits. Cox regression was used to calculate hazard ratios and corresponding 95% confidence intervals. After excluding the first year of observation, the analysis included 3,666 deaths among women and 2,170 deaths among men. Fish intake was inversely associated with risks of total, ischemic stroke, and diabetes mortality; the corresponding hazard ratios for the highest quintiles of intake compared with the lowest were 0.84 (95% confidence interval (CI): 0.76, 0.92), 0.63 (95% CI: 0.41, 0.94), and 0.61 (95% CI: 0.39, 0.95), respectively. No associations with cancer or ischemic heart disease mortality were observed. Further analyses suggested that the inverse associations with total, ischemic stroke, and diabetes mortality were primarily related to consumption of saltwater fish and intake of long-chain n-3 fatty acids. Overall, our findings support the postulated health benefits of fish consumption. In the last 20 years, many prospective cohort studies have assessed the relationships between food consumption and mortality. Result interpretation is mainly hindered by the limited adjustment for confounders and, to a lesser extent, the small sample sizes. The aim of this study was to investigate the association between dietary habits and all-cause mortality in a multicentre prospective cohort that included non-institutionalised, community-based elderly individuals (Three-City Study). A brief FFQ was administered at baseline.
Hazard ratios (HR) and 95% CI for all-cause mortality were estimated relative to the consumption frequency of several food groups, using Cox proportional hazards models adjusted for sex, centre, socio-demographic characteristics and health status indicators. Among the 8937 participants (mean age: 74.2 years, 60.7% women), 2016 deaths were recorded during an average follow-up of 9 years. The risk of death was significantly lower among subjects with the highest fruit and vegetable consumption (HR 0.90; 95% CI 0.82, 0.99, P=0.03) and with regular fish consumption (HR 0.89; 95% CI 0.81, 0.97, P=0.01). The benefit of olive oil use was found only in women (moderate olive oil use: HR 0.80; 95% CI 0.68, 0.94, P=0.007; intensive use: HR 0.72; 95% CI 0.60, 0.85, P=0.0002). Conversely, daily meat consumption increased the mortality risk (HR 1.12; 95% CI 1.01, 1.24, P=0.03). No association was found between risk of death and diet diversity and use of various fats. These findings suggest that fruit/vegetable, olive oil and regular fish consumption have a beneficial effect on the risk of death, independently of the socio-demographic features and the number of medical conditions. Background: A healthy diet, as defined by the US Dietary Guidelines for Americans (DGA), has been associated with lower morbidity and mortality from major chronic diseases in studies conducted in predominantly non-Hispanic white individuals. It is unknown whether this association can be extrapolated to African-Americans and low-income populations. Methods and Findings: We examined the associations of adherence to the DGA with total and cause-specific mortality in the Southern Community Cohort Study, a prospective study that recruited 84,735 American adults, aged 40-79 y, from 12 southeastern US states during 2002-2009, mostly through community health centers that serve low-income populations.
The present analysis included 50,434 African-Americans, 24,054 white individuals, and 3,084 individuals of other racial/ethnic groups, among whom 42,759 participants had an annual household income less than US$15,000. Usual dietary intakes were assessed using a validated food frequency questionnaire at baseline. Adherence to the DGA was measured by the Healthy Eating Index (HEI), 2010 and 2005 editions (HEI-2010 and HEI-2005, respectively). During a mean follow-up of 6.2 y, 6,906 deaths were identified, including 2,244 from cardiovascular disease, 1,794 from cancer, and 2,550 from other diseases. A higher HEI-2010 score was associated with lower risks of disease death, with adjusted hazard ratios (HRs) of 0.80 (95% CI, 0.73-0.86) for all-disease mortality, 0.81 (95% CI, 0.70-0.94) for cardiovascular disease mortality, 0.81 (95% CI, 0.69-0.95) for cancer mortality, and 0.77 (95% CI, 0.67-0.88) for other disease mortality, when comparing the highest quintile with the lowest (all p-values for trend < 0.05). Similar inverse associations between HEI-2010 score and mortality were observed regardless of sex, race, and income (all p-values for interaction > 0.50). Several component scores in the HEI-2010, including whole grains, dairy, seafood and plant proteins, and ratio of unsaturated to saturated fatty acids, showed significant inverse associations with total mortality. HEI-2005 score was also associated with lower disease mortality, with a HR of 0.86 (95% CI, 0.79-0.93) when comparing extreme quintiles. Given the observational study design, however, residual confounding cannot be completely ruled out. In addition, future studies are needed to evaluate the generalizability of these findings to African-Americans of other socioeconomic status.
Conclusions: Our results showed, to our knowledge for the first time, that adherence to the DGA was associated with lower total and cause-specific mortality in a low-income population, including a large proportion of African-Americans, living in the southeastern US. BACKGROUND: Because egg yolk has a high cholesterol concentration, limited egg consumption is often suggested to help prevent ischemic heart disease (IHD). OBJECTIVE: We epidemiologically examined the validity of this recommendation. DESIGN: We analyzed the relations of egg consumption to serum cholesterol and cause-specific and all-cause mortality by using the NIPPON DATA 80 (National Integrated Project for Prospective Observation of Non-communicable Disease And its Trends in the Aged, 1980) database. At the baseline examination in 1980, a nutritional survey was performed by using the food-frequency method in Japanese subjects aged ≥30 y. We followed 5186 women and 4077 men for 14 y. RESULTS: The subjects were categorized into 5 egg consumption groups on the basis of their responses to a questionnaire (≥2/d, 1/d, 1/2 d, 1-2/wk, and seldom). There were 69, 1396, 1667, 1742, and 315 women in each of the 5 groups, respectively. Age-adjusted total cholesterol (5.21, 5.04, 4.95, 4.91, and 4.92 mmol/L in the 5 egg consumption categories, respectively) was related to egg consumption (P < 0.0001, analysis of covariance). In women, unadjusted IHD mortality and all-cause mortality differed significantly between the groups [IHD mortality: 1.1, 0.5, 0.4, 0.5, and 2.0 per 1000 person-years, respectively (P = 0.008, chi-square test); all-cause mortality: 14.8, 8.0, 7.5, 7.5, and 14.5 per 1000 person-years, respectively (P < 0.0001, chi-square test)]. In men, egg consumption was not related to age-adjusted total cholesterol.
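Several of the studies above report mortality as deaths per 1,000 person-years and compare exposure groups with crude rate ratios. A minimal sketch of that arithmetic, using made-up counts for illustration (none of these numbers come from the studies):

```python
def rate_per_1000(deaths: int, person_years: float) -> float:
    """Crude mortality rate per 1,000 person-years of follow-up."""
    return deaths / person_years * 1000

# Hypothetical exposure groups (illustrative counts only)
high_intake = {"deaths": 120, "person_years": 80_000}
low_intake = {"deaths": 200, "person_years": 80_000}

rate_high = rate_per_1000(**high_intake)  # 1.5 per 1,000 person-years
rate_low = rate_per_1000(**low_intake)    # 2.5 per 1,000 person-years

# Crude rate ratio: 0.6, i.e. a 40% lower unadjusted rate in the
# high-intake group (no adjustment for confounders, unlike the
# Cox-model hazard ratios reported in the abstracts)
rate_ratio = rate_high / rate_low
```

The adjusted hazard ratios quoted in the abstracts play an analogous role but come from Cox regression with covariate adjustment, not from this crude division.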
Cox analysis found that, in women, all-cause mortality in the 1-2-eggs/wk group was significantly lower than that in the 1-egg/d group, whereas no such relations were noted in men. CONCLUSION: Limiting egg consumption may have some health benefits, at least in women in geographic areas where egg consumption makes a relatively large contribution to total dietary cholesterol intake. BACKGROUND: Prospective studies suggested that substituting whole-grain products for refined-grain products lowers the risks of type 2 diabetes and cardiovascular disease (CVD) in women. Although breakfast cereals are a major source of whole and refined grains, little is known about their direct association with the risk of premature mortality. OBJECTIVE: We prospectively evaluated the association between whole- and refined-grain breakfast cereal intakes and total and CVD-specific mortality in a cohort of US men. DESIGN: We examined 86,190 US male physicians aged 40-84 y in 1982 who were free of known CVD and cancer at baseline. RESULTS: During 5.5 y, we documented 3114 deaths from all causes, including 1381 due to CVD (488 myocardial infarctions and 146 strokes). Whole-grain breakfast cereal intake was inversely associated with total and CVD-specific mortality, independent of age; body mass index; smoking; alcohol intake; physical activity; history of diabetes, hypertension, or high cholesterol; and use of multivitamins. Compared with men who rarely or never consumed whole-grain cereal, men in the highest category of whole-grain cereal intake (≥1 serving/d) had multivariate-estimated relative risks of total and CVD-specific mortality of 0.83 (95% CI: 0.73, 0.94; P for trend < 0.001) and 0.80 (0.66, 0.97; P for trend < 0.001), respectively. In contrast, total and refined-grain breakfast cereal intakes were not significantly associated with total and CVD-specific mortality.
These findings persisted in analyses stratified by history of type 2 diabetes, hypertension, and high cholesterol. CONCLUSIONS: Both total mortality and CVD-specific mortality were inversely associated with whole-grain but not refined-grain breakfast cereal intake. These prospective data highlight the importance of distinguishing whole-grain from refined-grain cereals in the prevention of chronic diseases. BACKGROUND: A reduction in dietary cholesterol is recommended to prevent cardiovascular disease (CVD). Although eggs are important sources of cholesterol and other nutrients, limited and inconsistent data are available on the effects of egg consumption on the risk of CVD and mortality. OBJECTIVE: We aimed to examine the association between egg consumption and the risk of CVD and mortality. DESIGN: In a prospective cohort study of 21,327 participants from Physicians' Health Study I, egg consumption was assessed with an abbreviated food questionnaire. Cox regression was used to estimate relative risks. RESULTS: In an average follow-up of 20 y, 1550 new myocardial infarctions (MIs), 1342 incident strokes, and 5169 deaths occurred. Egg consumption was not associated with incident MI or stroke in a multivariate Cox regression. In contrast, adjusted hazard ratios (95% CI) for mortality were 1.0 (reference), 0.94 (0.87, 1.02), 1.03 (0.95, 1.11), 1.05 (0.93, 1.19), and 1.23 (1.11, 1.36) for the consumption of <1, 1, 2-4, 5-6, and ≥7 eggs/wk, respectively (P for trend < 0.0001). This association was stronger among diabetic subjects, in whom the risk of death in a comparison of the highest with the lowest category of egg consumption was twofold (hazard ratio: 2.01; 95% CI: 1.26, 3.20; P for interaction = 0.09). CONCLUSIONS: Infrequent egg consumption does not seem to influence the risk of CVD in male physicians.
In addition, egg consumption was positively related to mortality, more strongly so in diabetic subjects, in the study population. OBJECTIVE: To evaluate associations of fast-food items (FFI) and sugar-sweetened drinks (SSD) with mortality outcomes including deaths due to any cause, CVD and total cancers among a large sample of adults. DESIGN: Using a prospective design, risk of death was compared across baseline dietary exposures. Intakes of FFI and SSD were quantified using a semi-quantitative FFQ (baseline data collected 2000-2002). Deaths (n 4187) were obtained via the Washington State death file through 2008, excluding deaths in the first year of follow-up. Causes of death were categorized as due to CVD (I00-I99) or cancer (C00-D48). Cox models were used to estimate hazard ratios (HR) and 95% CI. SETTING: The Vitamins and Lifestyle (VITAL) study among adults living in Western Washington State. SUBJECTS: Men and women (n 69,582) between 50 and 76 years of age at baseline. RESULTS: Intakes of FFI and SSD were higher among individuals who were younger, female, African-American, American Indian or Alaska Native, Asian-American or Pacific Islander, of lower educational attainment, and of lower income (P<0.0001 for all). Higher risk of total mortality was associated with greater intake of FFI (HR=1.16; 95% CI 1.04, 1.29; P=0.004; comparing highest v. lowest quartile) and SSD (HR=1.19; 95% CI 1.08, 1.30; P<0.0001; comparing highest v. lowest quartile). Higher intake of FFI was associated with greater cancer-specific mortality, while an association with CVD-specific mortality was suggested. Associations between intake of SSD and cause-specific mortality were less clear. CONCLUSIONS: Intake of FFI and SSD has a detrimental effect on future mortality risk.
These findings may be salient to socially patterned disparities in mortality. Background: Governments worldwide recommend daily consumption of fruit and vegetables. We examine whether this benefits health in the general population of England. Methods: Cox regression was used to estimate HRs and 95% CI for an association between fruit and vegetable consumption and all-cause, cancer and cardiovascular mortality, adjusting for age, sex, social class, education, BMI, alcohol consumption and physical activity, in 65,226 participants aged 35+ years in the 2001-2008 Health Surveys for England, annual surveys of nationally representative random samples of the non-institutionalised population of England linked to mortality data (median follow-up: 7.7 years). Results: Fruit and vegetable consumption was associated with decreased all-cause mortality (adjusted HR for 7+ portions 0.67 (95% CI 0.58 to 0.78), reference category <1 portion). This association was more pronounced when excluding deaths within a year of baseline (0.58 (0.46 to 0.71)). Fruit and vegetable consumption was associated with reduced cancer (0.75 (0.59 to 0.96)) and cardiovascular mortality (0.69 (0.53 to 0.88)). Vegetables may have a stronger association with mortality than fruit (HR for 2 to 3 portions 0.81 (0.73 to 0.89) and 0.90 (0.82 to 0.98), respectively). Consumption of vegetables (0.85 (0.81 to 0.89) per portion) or salad (0.87 (0.82 to 0.92) per portion) was most protective, while frozen/canned fruit consumption was apparently associated with increased mortality (1.17 (1.07 to 1.28) per portion). Conclusions: A robust inverse association exists between fruit and vegetable consumption and mortality, with benefits seen in up to 7+ portions daily.
Further investigations into the effects of different types of fruit and vegetables are warranted. Background: A number of prospective studies have observed inverse associations between nut consumption and chronic diseases. However, these studies have predominantly been conducted in Western countries, where nut consumption tends to be more common among individuals with healthier lifestyles. It is important to examine the association in other parts of the world, and particularly among populations with different patterns of disease, socioeconomic status, lifestyles and disease risk factors. Our objective was to examine the association between nut consumption and mortality in a population whose nut consumption does not track with a healthy lifestyle. Methods: We examined the association between nut consumption and all-cause and cause-specific mortality in the 50,045 participants of the Golestan Cohort Study. Participants were aged 40 and older at baseline in 2004, and have been actively followed since that time. Dietary data were collected using a validated semi-quantitative food frequency questionnaire that was administered at baseline. Results: During 349,677 person-years of follow-up, 3981 cohort participants died, including 1732 women and 2249 men. Nut consumption was associated inversely with all-cause mortality. The pooled multivariate-adjusted hazard ratios for death among participants who ate nuts, as compared with those who did not, were 0.89 [95% confidence interval (CI), 0.82-0.95] for the consumption of less than one serving of nuts per week, 0.75 (95% CI, 0.67-0.85) for one to less than three servings per week and 0.71 (95% CI, 0.58-0.86) for three or more servings per week (P < 0.001 for trend). Among specific causes, significant inverse associations were observed between nut consumption and deaths due to cardiovascular disease, all cancers and gastrointestinal cancers.
Conclusions: This study provides evidence for an inverse association between nut consumption and mortality in a developing country, where nut consumption does not track with a healthy lifestyle. Further work is needed to establish the underlying mechanisms responsible for this association. Background: Recently, some US cohorts have shown a moderate association between red and processed meat consumption and mortality, supporting the results of previous studies among vegetarians. The aim of this study was to examine the association of red meat, processed meat, and poultry consumption with the risk of early death in the European Prospective Investigation into Cancer and Nutrition (EPIC). Methods: Included in the analysis were 448,568 men and women without prevalent cancer, stroke, or myocardial infarction, and with complete information on diet, smoking, physical activity and body mass index, who were between 35 and 69 years old at baseline. Cox proportional hazards regression was used to examine the association of meat consumption with all-cause and cause-specific mortality. Results: As of June 2009, 26,344 deaths were observed. After multivariate adjustment, a high consumption of red meat was related to higher all-cause mortality (hazard ratio (HR) = 1.14, 95% confidence interval (CI) 1.01 to 1.28, 160+ versus 10 to 19.9 g/day), and the association was stronger for processed meat (HR = 1.44, 95% CI 1.24 to 1.66, 160+ versus 10 to 19.9 g/day). After correction for measurement error, higher all-cause mortality remained significant only for processed meat (HR = 1.18, 95% CI 1.11 to 1.25, per 50 g/d). We estimated that 3.3% (95% CI 1.5% to 5.0%) of deaths could be prevented if all participants had a processed meat consumption of less than 20 g/day. Significant associations with processed meat intake were observed for cardiovascular diseases, cancer, and 'other causes of death'.
The consumption of poultry was not related to all-cause mortality. Conclusions: The results of our analysis support a moderate positive association between processed meat consumption and mortality, in particular due to cardiovascular diseases, but also to cancer. Objective: To investigate the relative importance of the individual components of the Mediterranean diet in generating the inverse association of increased adherence to this diet and overall mortality. Design: Prospective cohort study. Setting: Greek segment of the European Prospective Investigation into Cancer and Nutrition (EPIC). Participants: 23,349 men and women, not previously diagnosed with cancer, coronary heart disease, or diabetes, with documented survival status until June 2008 and complete information on nutritional variables and important covariates at enrolment. Main outcome measure: All-cause mortality. Results: After a mean follow-up of 8.5 years, 652 deaths from any cause had occurred among 12,694 participants with Mediterranean diet scores 0-4 and 423 among 10,655 participants with scores of 5 or more. Controlling for potential confounders, higher adherence to a Mediterranean diet was associated with a statistically significant reduction in total mortality (adjusted mortality ratio per two-unit increase in score 0.864, 95% confidence interval 0.802 to 0.932). The contributions of the individual components of the Mediterranean diet to this association were: moderate ethanol consumption 23.5%, low consumption of meat and meat products 16.6%, high vegetable consumption 16.2%, high fruit and nut consumption 11.2%, high monounsaturated-to-saturated lipid ratio 10.6%, and high legume consumption 9.7%. The contributions of high cereal consumption and low dairy consumption were minimal, whereas high fish and seafood consumption was associated with a non-significant increase in mortality ratio.
Conclusion: The dominant components of the Mediterranean diet score as a predictor of lower mortality are moderate consumption of ethanol, low consumption of meat and meat products, and high consumption of vegetables, fruits and nuts, olive oil, and legumes. Minimal contributions were found for cereals and dairy products, possibly because they are heterogeneous categories of foods with differential health effects, and for fish and seafood, the intake of which is low in this population. Abstract: Fish is a source of important nutrients and may play a role in preventing heart diseases and other health outcomes. However, studies of overall mortality and cause-specific mortality related to fish consumption are inconclusive. We examined the rate of overall mortality, as well as mortality from ischaemic heart disease and cancer, in relation to the intake of total fish, lean fish, and fatty fish in a large prospective cohort including ten European countries. More than 500,000 men and women completed a dietary questionnaire in 1992-1999 and were followed up for mortality until the end of 2010. 32,587 persons were reported dead since enrolment. Hazard ratios and their 99% confidence intervals were estimated using Cox proportional hazard regression models. Fish consumption was examined using quintiles based on reported consumption, using moderate fish consumption (third quintile) as reference, and as continuous variables, using increments of 10 g/day. All analyses were adjusted for possible confounders.
No association was seen between fish consumption and overall or cause-specific mortality in either the categorical or the continuous analyses, but there seemed to be a U-shaped trend (p < 0.000) between fatty fish consumption and total mortality, and between total fish consumption and cancer mortality (p = 0.046).

Purpose: A Mediterranean-type dietary pattern has been associated with lower risk of cardiovascular disease (CVD) and other chronic diseases, primarily in Southern European populations. We examined whether the Mediterranean diet score (MDS) is associated with total, CVD, coronary heart disease (CHD) and stroke mortality in a prospective cohort study in three Eastern European populations. Methods: A total of 19,333 male and female participants of the Health, Alcohol and Psychosocial factors In Eastern Europe (HAPIEE) study in the Czech Republic, Poland and the Russian Federation were included in the analysis. Diet was assessed by food frequency questionnaire, and the MDS was derived from consumption of nine groups of food using absolute cut-offs. Mortality was ascertained by linkage with death registers. Results: Over the median follow-up time of 7 years, 1314 participants died. The proportion of participants with high adherence to the Mediterranean diet was low (25%). A one standard deviation (SD) increase in the MDS (equivalent to a 2.2-point increase in the score) was inversely associated with death from all causes (HR, 95% CI 0.93, 0.88-0.98) and from CVD (0.90, 0.81-0.99), even after multivariable adjustment. An inverse but statistically non-significant association was found for CHD (0.90, 0.78-1.03) and stroke (0.87, 0.71-1.07). The MDS effects were similar in each country cohort. Conclusion: Higher adherence to the Mediterranean diet was associated with reduced risk of total and CVD deaths in these large Eastern European urban populations.
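Several of the cohorts above report hazard ratios on different exposure scales: the EPIC abstract gives a mortality ratio per two-unit increase in diet score, while the HAPIEE study reports it per SD (a 2.2-point increase). Because Cox hazard ratios are multiplicative on the log scale, a ratio reported per k units can be rescaled to per single unit by taking the k-th root. A minimal sketch of that arithmetic (the helper name is illustrative, not from any of the studies):

```python
import math

def rescale_hazard_ratio(hr: float, from_units: float, to_units: float = 1.0) -> float:
    """Rescale a hazard ratio reported per `from_units` of exposure to
    per `to_units`, using multiplicativity of HRs on the log scale."""
    return math.exp(math.log(hr) * to_units / from_units)

# EPIC: adjusted mortality ratio 0.864 per two-unit score increase,
# i.e. sqrt(0.864) ~ 0.93 per single unit.
per_unit_epic = rescale_hazard_ratio(0.864, from_units=2)

# HAPIEE: HR 0.93 per SD (2.2 points), rescaled to per single point.
per_point_hapiee = rescale_hazard_ratio(0.93, from_units=2.2)

print(round(per_unit_epic, 3), round(per_point_hapiee, 3))
```

The same transformation applies to the confidence-interval limits, since the interval is symmetric on the log-HR scale.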
The application of the MDS with absolute cut-offs appears suitable for non-Mediterranean populations.

Background: There is growing evidence for a relationship between fruit and vegetable consumption and all-cause mortality. Few studies, however, have specifically explored consuming raw versus cooked vegetables in relation to health and mortality outcomes. The purpose of this study was to examine the relation of all-cause mortality with: (a) fruit and vegetable consumption, either combined or separately; (b) the consumption of raw versus cooked vegetables, in a large cohort of Australian middle-aged and older adults. Methods: The sample included 150,969 adults aged 45 years and over from the 45 and Up Study, a prospective cohort study conducted in New South Wales, Australia. Self-reported baseline questionnaire data (2006-09) were linked to mortality data up to June 2014. Fruit and vegetable consumption was assessed by validated short questions. Crude and adjusted hazard ratios were calculated using Cox proportional hazard models. Covariates included socio-demographic characteristics, health-related and dietary variables. Results: During a mean follow-up of 6.2 years, 6038 (4%) participants died from all causes. In the fully adjusted models, increasing consumption of fruit and vegetables combined was associated with reductions in all-cause mortality, with the highest risk reduction seen up to 7 serves/day or more of fruit and vegetables (P for trend = 0.002; hazard ratio for highest versus lowest consumption quartile: 0.90; 95% confidence interval: 0.84, 0.97). Separate consumption of fruit and vegetables, as well as consumption of raw or cooked vegetables, was associated with a reduced risk of all-cause mortality in the crude and minimally adjusted models (all P for trend < 0.05). With the exception of raw vegetables, these associations remained significant in the fully adjusted models (all P for trend < 0.05).
Age and sex were significant effect modifiers of the association between fruit and vegetable consumption and all-cause mortality. Conclusions: Fruit and vegetable consumption was inversely related to all-cause mortality in this large Australian cohort. Further studies examining the effects of raw versus cooked vegetables are needed.

OBJECTIVE: The aim of this study was to assess the association between nut consumption and all-cause mortality after 5 y of follow-up in a Spanish cohort. METHODS: The SUN (Seguimiento Universidad de Navarra, University of Navarra Follow-up) project is a prospective cohort study of Spanish university graduates. Information is gathered by mailed questionnaires collected biennially. In all, 17,184 participants were followed for up to 5 y. Baseline nut consumption was collected by self-reported data, using a validated 136-item semi-quantitative food frequency questionnaire. Information on mortality was collected by permanent contact with the SUN participants and their families, postal authorities, and the National Death Index. The association between baseline nut consumption and all-cause mortality was assessed using Cox proportional hazards models to adjust for potential confounding. Baseline nut consumption was categorized in two ways. In a first analysis, energy-adjusted quintiles of nut consumption (measured in g/d) were used; to adjust for total energy intake, the residuals method was used. In a second analysis, participants were categorized into four groups according to pre-established categories of nut consumption (servings/d or servings/wk). Both analyses were adjusted for potential confounding factors. RESULTS: Participants who consumed nuts ≥2/wk had a 56% lower risk for all-cause mortality than those who never or almost never consumed nuts (adjusted hazard ratio, 0.44; 95% confidence interval, 0.23-0.86).
CONCLUSION: Nut consumption was significantly associated with a reduced risk for all-cause mortality after the first 5 y of follow-up in the SUN project.

Abstract: The purpose of this epidemiological study was to estimate mortality risk associated with poor diet quality (consumption of five food groups), extremes of body mass index (BMI), waist circumference, and impaired food-related activities of daily living among community-dwelling older Black and White men and women. The design of the current study was a retrospective-prospective cohort study. The sample included residents (n = 1920) of five North Carolina Piedmont counties. The dependent variable was four-year all-cause mortality. Analyses were stratified by gender and race, and controlled covariates included: age, living with others, income, smoking and alcohol use, cognitive status, and overall self-rated health. Data were self-reported to interviewers, except BMI and waist circumference, which were measured by trained interviewers. Difficulty in fixing meals elevated the risk of mortality between 2.7 and 6.5 times across the four gender-race groups. Among older adults, inability to fix a meal conferred more risk of mortality than did lack of financial means. Adequate servings of vegetables were uniformly protective, although significant only among Black males. Neither BMI nor waist circumference conferred significant mortality risk. These population-based findings suggest relationships between nutrition risk factors and mortality that are unique and require further focused studies.

Between 1986 and 1989, 18,244 men aged 45-64 years in Shanghai, China, participated in a prospective study of diet and cancer. All participants completed an in-person, structured interview and provided blood and urine samples. As of September 1, 1998, 113 deaths from acute myocardial infarction were identified.
After analyses were adjusted for age, total energy intake, and known cardiovascular disease risk factors, men who consumed ≥200 g of fish/shellfish per week had a relative risk of 0.41 (95% confidence interval: 0.22, 0.78) for fatal acute myocardial infarction compared with men consuming <50 g per week. Similarly, dietary intake of n-3 fatty acids derived from seafood was also significantly associated with reduced mortality from myocardial infarction. Neither dietary seafood nor n-3 fatty acid intake was associated with a reduced risk of death from stroke or ischemic heart disease other than acute myocardial infarction. However, approximately a 20% reduction in total mortality associated with weekly fish/shellfish intake was observed in the study population (relative risk = 0.79, 95% confidence interval: 0.69, 0.91). These prospective data suggest that eating fish and shellfish weekly reduces the risk of fatal myocardial infarction in middle-aged and older men in Shanghai, China.

Although diet quality is implicated in cardiovascular disease (CVD) risk, few studies have investigated the relation between diet quality and the risks of CVD and mortality in older adults. This study examined the prospective associations between dietary scores and risk of CVD and all-cause mortality in older British men. A total of 3328 men (aged 60-79 y) from the British Regional Heart Study, free from CVD at baseline, were followed up for 11.3 y for CVD and mortality. Baseline food-frequency questionnaire data were used to generate 2 dietary scores: the Healthy Diet Indicator (HDI), based on WHO dietary guidelines, and the Elderly Dietary Index (EDI), based on a Mediterranean-style dietary intake, with higher scores indicating greater compliance with dietary recommendations.
Cox proportional hazards regression analyses assessed associations between quartiles of HDI and EDI and risk of all-cause mortality, CVD mortality, CVD events, and coronary heart disease (CHD) events. During follow-up, 933 deaths, 327 CVD deaths, 582 CVD events, and 307 CHD events occurred. Men in the highest compared with the lowest EDI quartile had significantly lower risks of all-cause mortality (HR: 0.75; 95% CI: 0.60, 0.94; P-trend = 0.03), CVD mortality (HR: 0.63; 95% CI: 0.42, 0.94; P-trend = 0.03), and CHD events (HR: 0.66; 95% CI: 0.45, 0.97; P-trend = 0.05), but not CVD events (HR: 0.79; 95% CI: 0.60, 1.05; P-trend = 0.16), after adjustment for sociodemographic, behavioral, and cardiovascular risk factors. The HDI was not significantly associated with any of the outcomes. The EDI appears to be more useful than the HDI for assessing diet quality in relation to CVD and mortality risk in older men. Encouraging older adults to adhere to the guidelines inherent in the EDI criteria may have public health benefits.

OBJECTIVE: To examine the effects of non-alcoholic beverage and caffeine consumption on all-cause mortality in older adults. METHODS: The Leisure World Cohort Study is a prospective study of residents of a California retirement community. A baseline postal health survey included details on coffee, tea, milk, soft drink, and chocolate consumption. Participants were followed for 23 years (1981-2004). Risk ratios (RRs) of death were calculated using Cox regression for 8644 women and 4980 men (median age at entry, 74 years) and adjusted for age, gender, and multiple potential confounders. RESULTS: Caffeine consumption exhibited a U-shaped mortality curve.
Moderate caffeine consumers had a significantly reduced risk of death (multivariable-adjusted RR = 0.94, 95% CI: 0.89, 0.99 for 100-199 mg/day and RR = 0.90, 95% CI: 0.85, 0.94 for 200-399 mg/day, compared with those consuming <50 mg/day). Individuals who drank more than 1 can/week of artificially sweetened (but not sugar-sweetened) soft drink (cola and other) had an 8% increased risk (95% CI: 1.01-1.16). Neither milk nor tea had a significant effect on mortality after multivariable adjustment. CONCLUSIONS: Moderate caffeine consumption appeared beneficial in reducing risk of death. Attenuation in the observed associations between mortality and intake of tea and milk with adjustment for potential confounders suggests that such consumption identifies those with other mortality-associated lifestyle and health risks. The increased death risk with consumption of artificially sweetened, but not sugar-sweetened, soft drinks suggests an effect of the sweetener rather than other components of the soft drinks, although residual confounding remains a possibility.

Although previous studies have shown that dietary consumption of certain food groups is associated with a lower risk of cancer, heart disease and stroke mortality in western populations, limited prospective data are available from China. We prospectively examined the association between dietary intake of different food groups at baseline and risk of total, cancer, heart disease and stroke mortality outcomes in the Linxian Nutrition Intervention Trials (NIT) cohort. In 1984-1991, 2445 subjects aged 40-69 years from the Linxian NIT cohort completed a food frequency questionnaire. Deaths from esophageal and gastric cancer, heart disease and stroke were identified through up to 26 years of follow-up.
We used Cox proportional hazard models to calculate hazard ratios and 95% confidence intervals for associations between intake of groups of food items and these mortality endpoints. We concluded that higher intake of certain food groups was associated with lower risk of gastric cancer, heart disease and stroke mortality in a prospective cohort in rural China. Our findings provide additional evidence that increasing intake of grains, vegetables, beans, fruits and nuts may help reduce mortality from these diseases.

Background: Prospective studies in non-Mediterranean populations have consistently related increasing nut consumption to lower coronary heart disease mortality. A small protective effect on all-cause and cancer mortality has also been suggested. The objective was to examine the association between frequency of nut consumption and mortality in individuals at high cardiovascular risk from Spain, a Mediterranean country with a relatively high average nut intake per person. Methods: We evaluated 7,216 men and women aged 55 to 80 years randomized to 1 of 3 interventions (Mediterranean diets supplemented with nuts or olive oil, and a control diet) in the PREDIMED ('PREvención con DIeta MEDiterránea') study. Nut consumption was assessed at baseline and mortality was ascertained by medical records and linkage to the National Death Index. Multivariable-adjusted Cox regression and multivariable analyses with generalized estimating equation models were used to assess the association between yearly repeated measurements of nut consumption and mortality. Results: During a median follow-up of 4.8 years, 323 total deaths, 81 cardiovascular deaths and 130 cancer deaths occurred. Nut consumption was associated with a significantly reduced risk of all-cause mortality (P for trend < 0.05, all).
Compared to non-consumers, subjects consuming nuts >3 servings/week (32% of the cohort) had a 39% lower mortality risk (hazard ratio (HR) 0.61; 95% CI 0.45 to 0.83). A similar protective effect against cardiovascular and cancer mortality was observed. Participants allocated to the Mediterranean diet with nuts group who consumed nuts >3 servings/week at baseline had the lowest total mortality risk (HR 0.37; 95% CI 0.22 to 0.66). Conclusions: Increased frequency of nut consumption was associated with a significantly reduced risk of mortality in a Mediterranean population at high cardiovascular risk. Please see related commentary: http://www.biomedcentral.com/1741-7015/11/165. Trial registration: ClinicalTrials.gov; International Standard Randomized Controlled Trial Number (ISRCTN): 35739639. Registration date: 5 October 2005.

Nut intake has been associated with reduced inflammatory status and lower risk of CVD and mortality. The aim of this study was to examine the relationship between nut consumption and mortality and the role of inflammation. We conducted a population-based prospective investigation of 19,386 subjects enrolled in the Moli-sani study. Food intake was recorded by the Italian version of the European Prospective Investigation into Cancer and Nutrition FFQ. C-reactive protein, leucocyte and platelet counts and the neutrophil:lymphocyte ratio were used as biomarkers of low-grade inflammation. Hazard ratios (HR) were calculated using multivariable Cox proportional hazard models. During a median follow-up of 4.3 years, 334 all-cause deaths occurred. As compared with subjects who never ate nuts, rare intake (≤2 times/month) was inversely associated with mortality (multivariable HR = 0.68; 95% CI 0.54, 0.87). At intakes ≥8 times/month, a greater protection was observed (HR = 0.53; 0.32, 0.90). Nut intake (v.
no intake) conferred greater protection among individuals poorly adhering to the Mediterranean diet (MD). A significant reduction in cancer deaths (HR = 0.64; 95% CI 0.44, 0.94) was also observed, whereas the impact on CVD deaths was limited to an inverse, but not significant, trend. Biomarkers of low-grade inflammation were reduced in nut consumers but did not account for the association with mortality. In conclusion, nut intake was associated with reduced cancer and total mortality. The protection was stronger in individuals with lower adherence to the MD, whereas it was similar in high-risk groups (diabetics, obese, smokers or those with the metabolic syndrome) as compared with low-risk subjects. Inflammation did not explain the observed relationship.

Objective: To investigate dietary determinants of ischaemic heart disease (IHD) in health-conscious individuals to explain the reduced risk in vegetarians, and to examine the relation between IHD and body mass index (BMI) within the normal range. Design: Prospective observation of vegetarians, semi-vegetarians, and meat eaters for whom baseline dietary data, reported weight and height information, social class, and smoking habits were recorded. Subjects: 10,802 men and women in the UK aged between 16 and 79; mean duration of follow-up 13.3 years. Main outcome measures: Death rate ratios for IHD and total mortality in relation to dietary and other characteristics recorded at recruitment (reference category death rate = 100). Results: IHD mortality was less than half that expected from the experience reported for all of England and Wales. An increase in mortality for IHD was observed with increasing intakes of total animal fat, saturated animal fat and dietary cholesterol: death rate ratios in the third tertile compared with the first tertile were 329 (95% confidence interval (CI) 150 to 721), 277 (95% CI 125 to 613), and 353 (95% CI 157 to 796), respectively.
No protective effects were observed for dietary fibre, fish or alcohol. Within the study, death rate ratios were increased among those in the upper half of the normal BMI range (22.5 to <25) and those who were overweight (BMI ≥25) compared with those with BMI 20 to <22.5. Conclusions: In these relatively health-conscious individuals, the deleterious effects of saturated animal fat and dietary cholesterol appear to be more important in the aetiology of IHD than the protective effect of dietary fibre. Reduced intakes of saturated animal fat and cholesterol may explain the lower rates of IHD among vegetarians compared with meat eaters. Increasing BMI within the normal range is associated with increased risk of IHD. The results have important public health implications.

STUDY OBJECTIVE: To study the association between reported milk consumption and cardiovascular and all-cause mortality. DESIGN: A prospective study of 5765 men aged 35-64 at the time of examination. SETTING: Workplaces in the west of Scotland between 1970 and 1973. PARTICIPANTS: Men who completed a health and lifestyle questionnaire, which asked about daily milk consumption, and who attended for a medical examination. MAIN RESULTS: 150 (2.6%) men reported drinking more than one and a third pints a day, some 2977 (51.6%) reported drinking between a third and one and a third pints a day, and 2638 (45.8%) reported drinking less than a third of a pint a day. There were a total of 2350 deaths over the 25-year follow-up period, of which 892 deaths were attributed to coronary heart disease. The relative risk, adjusted for socioeconomic position, health behaviours and health status, for deaths from all causes for men who drank one third to one and a third pints a day versus those who drank less than a third of a pint was 0.90 (95% CI 0.83, 0.97).
The adjusted relative risk for deaths attributed to coronary heart disease for men who drank one third to one and a third pints a day versus those who drank less than one third of a pint was 0.92 (95% CI 0.81, 1.06). CONCLUSIONS: No evidence was found that men who consumed milk each day, at a time when most milk consumed was full-fat milk, were at increased risk of death from all causes or death from coronary heart disease.

Purpose: Existing data from prospective cohort studies on dairy consumption and cardiovascular diseases are inconsistent. Even though the association between total dairy and cardiovascular diseases has been studied before, little is known about the effect of different types of dairy products on cardiovascular diseases (CVD). The objective of this study was to examine the relationship between (type of) dairy intake and CVD mortality and all-cause mortality in a Dutch population. Methods: We examined the relationship between dairy intake and CVD mortality and all-cause mortality in 1956 participants of the Hoorn Study (aged 50-75 years), free of CVD at baseline. Hazard ratios with 95% CIs were obtained for CVD mortality and all-cause mortality per standard deviation (SD) increase in dairy intake, with adjustment for age, sex, BMI, smoking, education, total energy intake, alcohol consumption, physical activity, and dietary intakes. Results: During 12.4 years of follow-up, 403 participants died, of whom 116 had a fatal CVD event. Overall dairy intake was not associated with CVD mortality or all-cause mortality. Each SD increase in high-fat dairy intake was associated with a 32% higher risk of CVD mortality (95% CI: 7-61%).
Conclusion: In this prospective cohort study, the intake of high-fat dairy products was associated with an increased risk of CVD mortality.

Objective: The aim of this study was to evaluate adherence to the Mediterranean Diet (MD) and its association with all-cause mortality in an elderly Italian population. Design: Data analysis of a longitudinal study of a representative, age-stratified population sample. Setting: Study data are based upon the Italian Longitudinal Study on Aging (ILSA), a prospective, community-based cohort study. The baseline evaluation was carried out in 1992 and the follow-ups in 1996 and 2000. Participants: Participant food intake assessment was available at baseline for 4,232 subjects; information on survival was available for 2,665 at the 2000 follow-up. Measurements: Adherence to the MD was evaluated with an a priori score based on the Mediterranean pyramid components. Cox proportional hazard models were used to assess the relationship between the MD score and all-cause mortality. Six hundred and sixty-five subjects had died by the second follow-up (identified up to the first and second follow-up together; mean follow-up: 7.1±2.6 years). Results: At the 2000 follow-up, adjusting for other confounding factors, participants with high adherence to the MD (highest tertile of the MD score distribution) had an all-cause mortality risk 34% lower than that of subjects with low adherence (Hazard Ratio = 0.66; 95% CI: 0.49-0.90; p = 0.0144). Conclusion: According to the study results, higher adherence to the MD was associated with lower all-cause mortality risk in an elderly Italian population.

A prospective cohort study, involving 141 Anglo-Celts and 189 Greek-Australians of both sexes aged 70 years or more, was undertaken in Melbourne, Australia.
The objective was to evaluate whether adherence to the principles of the Mediterranean diet affects survival of elderly people in developed non-Mediterranean countries. Diet was assessed using an extensive validated questionnaire on food intake. A one-unit increase in a diet score, devised a priori on the basis of eight key features of the traditional common diet in the Mediterranean region, was associated with a 17% reduction in overall mortality (two-tailed P value 0.07). Mortality reduction with increasing diet score was at least as evident among Anglo-Celts as among Greek-Australians. We conclude that a diet adhering to the principles of the traditional Mediterranean diet is associated with longer survival among Australians of either Greek or Anglo-Celtic origin.

BACKGROUND: High red meat consumption is associated with a shorter survival and higher risk of cardiovascular disease (CVD), cancer, and all-cause mortality. Fruit and vegetable (FV) consumption is associated with a longer survival and lower mortality risk. Whether high FV consumption can counterbalance the negative impact of high red meat consumption is unknown. OBJECTIVE: We evaluated 2 large prospective cohorts of Swedish men and women (the Swedish Mammography Cohort and the Cohort of Swedish Men) to determine whether the association between red meat consumption and the risk of all-cause, CVD, and cancer-specific mortality differs across amounts of FV intake. DESIGN: The study population included 74,645 Swedish men and women. Red meat and FV consumption were assessed through a self-administered questionnaire. We estimated HRs of all-cause, CVD, and cancer mortality according to quintiles of total red meat consumption. We next investigated possible interactions between red meat and FV consumption and evaluated the dose-response associations at low, medium, and high FV intake.
RESULTS: Compared with participants in the lowest quintile of total red meat consumption, those in the highest quintile had a 21% increased risk of all-cause mortality (HR: 1.21; 95% CI: 1.13, 1.29), a 29% increased risk of CVD mortality (HR: 1.29; 95% CI: 1.14, 1.46), and no increase in the risk of cancer mortality (HR: 1.00; 95% CI: 0.88, 1.43). Results were remarkably similar across amounts of FV consumption, and no interaction between red meat and FV consumption was detected. CONCLUSION: High intakes of red meat were associated with a higher risk of all-cause and CVD mortality. The increased risks were consistently observed in participants with low, medium, and high FV consumption. The Swedish Mammography Cohort and the Cohort of Swedish Men were registered at clinicaltrials.gov as NCT01127698 and NCT01127711, respectively.

Objective: To test the hypothesis that milk drinking increases the risk of ischaemic heart disease (IHD) and ischaemic stroke in a prospective study. Design: In the Caerphilly Cohort Study, dietary data, including milk consumption, were collected by a semiquantitative food frequency questionnaire in 1979-1983. The cohort has been followed for 20-24 y and incident IHD and stroke events identified. Subjects: A representative population sample in South Wales of 2512 men, aged 45-59 y at recruitment. Main outcome measures: In total, 493 men had an IHD event and 185 an ischaemic stroke during follow-up. Results: After adjustment, the hazard ratio in men with a milk consumption of one pint (0.57 l) or more per day, relative to men who stated that they consumed no milk, was 0.71 (0.40-1.26) for IHD and 0.66 (0.24-1.81) for ischaemic stroke. At baseline, 606 men had had clinical or ECG evidence of vascular disease, and in these the vascular risk was even lower (0.37; 0.15-0.90).
The hazard ratio for IHD and ischaemic stroke combined was 0.64 (0.39-1.06) in all men and 0.37 (0.15-0.90) in those who had had a prior vascular event. Conclusion: The data provide no convincing evidence that milk consumption is associated with an increase in vascular disease risk. Evidence from an overview of all published cohort studies on this topic should be informative. Sponsorship: The Medical Research Council, the University of Wales College of Medicine and Bristol University; current support is from the Food Standards Agency.

Background/Objectives: Dairy foods contain various nutrients that may affect health. We investigated whether intake of dairy products or related nutrients is associated with mortality due to cardiovascular disease (CVD), cancer and all causes. Subjects/Methods: We carried out a 16-year prospective study among a community-based sample of 1529 adult Australians aged 25-78 years at baseline. Habitual intakes of dairy products (total, high/low-fat dairy, milk, yoghurt and full-fat cheese), calcium and vitamin D were estimated as mean reported intake using validated food frequency questionnaires (FFQs) self-administered in 1992, 1994 and 1996. National Death Index data were used to ascertain mortality and cause of death between 1992 and 2007. Hazard ratios (HRs) were calculated using Cox regression analysis. Results: During an average follow-up time of 14.4 years, 177 participants died, including 61 deaths due to CVD and 58 deaths due to cancer. There was no consistent and significant association between total dairy intake and total or cause-specific mortality. However, compared with those with the lowest intake of full-fat dairy, participants with the highest intake (median intake 339 g/day) had reduced death due to CVD (HR: 0.31; 95% confidence interval (CI): 0.12-0.79; P for trend = 0.04) after adjustment for calcium intake and other confounders.
Intakes of low-fat dairy, specific dairy foods, calcium and vitamin D showed no consistent associations. Conclusions: Overall intake of dairy products was not associated with mortality. A possible beneficial association between intake of full-fat dairy and cardiovascular mortality needs further assessment and confirmation.

Several healthy dietary patterns have been linked to longevity. Recently, a Nordic dietary pattern was associated with lower overall mortality. No study has, however, investigated this dietary pattern in relation to cause-specific mortality. The aim of the present study was to examine the association between adherence to a healthy Nordic food index (consisting of wholegrain bread, oatmeal, apples/pears, root vegetables, cabbages and fish/shellfish) and overall mortality, and death by cardiovascular disease, cancer, injuries/suicide and other causes. We conducted a prospective analysis in the Swedish Women's Lifestyle and Health cohort, including 44,961 women, aged 29-49 years, who completed a food frequency questionnaire in 1991-1992 and have been followed up for mortality ever since through Swedish registries. The median follow-up time is 21.3 years, and mortality rate ratios (MRR) were calculated using Cox proportional hazards models. Compared to women with the lowest index score (0-1 points), those with the highest score (4-6 points) had an 18% lower overall mortality (MRR 0.82; 0.71-0.93, p < 0.0004). A 1-point increment in the healthy Nordic food index was associated with a significantly lower risk of all-cause mortality: 6% (3-9%), cancer mortality: 5% (1-9%) and mortality from other causes: 16% (8-22%). When examining the diet components individually, only wholegrain bread and apples/pears were significantly inversely associated with all-cause mortality. We observed no effect modification by smoking status, BMI or age at baseline.
The present study encourages adherence to a healthy Nordic food index and warrants further investigation of the strong association with non-cancer, non-cardiovascular and non-injury/suicide deaths.

BACKGROUND: Beverages are contributing an increased proportion of energy to the diet. Because they elicit a weak compensatory dietary response, they may increase the risk of positive energy balance. OBJECTIVES: This study aimed to document the differential effects of matched liquid and solid carbohydrate loads on diet and body weight. DESIGN: In a cross-over design, seven males and eight females consumed dietary carbohydrate loads of 1880 kJ/day as a liquid (soda) or solid (jelly beans) during two 4-week periods separated by a 4-week washout. Subjects were permitted to consume the loads however they chose. In addition to baseline measurements, diet records were obtained on random days throughout the study, body composition was measured weekly, physical activity was assessed before and after treatments, and hunger was assessed during washout and midway through each treatment. RESULTS: Free-feeding energy intake during the solid period was significantly lower than intake prior to this period. Dietary energy compensation was precise (118%). No decrease in free-feeding energy intake occurred during the liquid period. Total daily energy intake increased by an amount equal to the load, resulting in dietary compensation of −17%. Consequently, body weight and BMI increased significantly only during the liquid period. Physical activity and hunger were unchanged. CONCLUSIONS: This study indicates that liquid carbohydrate promotes positive energy balance, whereas a comparable solid carbohydrate elicits precise dietary compensation.
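The healthy Nordic food index results above report both a per-point effect (6% lower all-cause mortality per 1-point increment) and a top-versus-bottom contrast (MRR 0.82 for scores 4-6 versus 0-1). Under the proportional hazards assumption, the per-point ratio compounds multiplicatively across a score difference, which lets the two figures be cross-checked. A small sketch of that arithmetic (illustrative only, not a reanalysis of the cohort; categorical and continuous model fits need not agree exactly):

```python
# Per-point mortality rate ratio implied by the reported 6% lower
# all-cause mortality per one-point increase in the Nordic food index.
per_point = 1 - 0.06  # MRR = 0.94 per point

# Under proportional hazards, ratios compound multiplicatively, so a
# 4-point score difference implies per_point ** 4 ~ 0.78, in the same
# neighbourhood as the reported MRR of 0.82 for scores 4-6 vs 0-1.
compounded = per_point ** 4
print(round(compounded, 2))
```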
Increased consumption of energy-yielding fluids may promote positive energy balance. Objective: A number of long-term population-based studies have tried to study fruit and vegetable consumption in relation to cardiovascular disease, cancer and total mortality. Few of these studies are based on randomly selected population samples. The aim of the study was to investigate the long-term effect of fruit and vegetable consumption on mortality, cardiovascular disease, cardiovascular death, cancer morbidity and cancer death among middle-aged and elderly men. Design: Prospective cohort study. Setting: General community. The Study of Men Born in 1913. Subjects: 792 men at age 54 who participated in a screening examination in 1967. Main outcome measures: A food frequency questionnaire was used to obtain information on the dietary habits of 730 of the men (92%). All men were followed up with repeated examinations until the age of 80. Results: Cardiovascular as well as total mortality was significantly lower among men with high fruit consumption in univariate analysis. There was no correlation between fruit or vegetable consumption and cancer incidence, cancer death or cardiovascular disease. In multivariate survival analysis where smoking, cholesterol and hypertension were taken into account, there was a significantly lower mortality among men with a high fruit consumption during 16 y of follow-up until the age of 70 (P=0.042), but this finding was no longer statistically significant during 26 y of follow-up at the age of 80 (P=0.051). Conclusions: Daily fruit consumption seems to have a positive effect on long-term survival independently of other traditional cardiovascular risk factors like smoking, hypertension and cholesterol. Sponsorship: This study was supported by grants from the Swedish Medical Research Council (K98-274-06276-17), King Gustav V and Queen Victoria's Foundation, and Göteborg University.
European Journal of Clinical Nutrition (2000) 54. In this study, the relation between fruit and vegetable consumption and mortality was investigated within the European Prospective Investigation Into Cancer and Nutrition. Survival analyses were performed, including 451,151 participants from 10 European countries, recruited between 1992 and 2000 and followed until 2010. Hazard ratios, rate advancement periods and preventable proportions were calculated to compare the risk of death between quartiles of consumption, to estimate the period by which the risk of death was postponed among high consumers, and to estimate the proportions of deaths that could be prevented if all participants shifted their consumption 1 quartile upward. Consumption of fruits and vegetables was inversely associated with all-cause mortality (for the highest quartile, hazard ratio = 0.90, 95% confidence interval (CI): 0.86, 0.94), with a rate advancement period of 1.12 years (95% CI: 0.70, 1.54) and a preventable proportion of 2.95%. This association was driven mainly by cardiovascular disease mortality (for the highest quartile, hazard ratio = 0.85, 95% CI: 0.77, 0.93). Stronger inverse associations were observed for participants with high alcohol consumption or high body mass index, and were suggested in smokers. Inverse associations were stronger for raw than for cooked vegetable consumption. These results support the evidence that fruit and vegetable consumption is associated with a lower risk of death. Higher intake of fruits, vegetables and antioxidants may help protect against oxidative damage, thus lowering cancer and cardiovascular disease risk. This Washington County, Maryland, prospective study examined the association of fruit, vegetable and antioxidant intake with all-cause, cancer and cardiovascular disease death.
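The rate advancement period used in the EPIC analysis above can be recovered from Cox model output: under the usual definition (Brenner et al.), it is the exposure's log-hazard coefficient divided by the coefficient for age. A sketch under an assumed age effect of HR 1.10 per year, which the abstract does not report:

```python
import math

def rate_advancement_period(hr_exposure, hr_per_year_of_age):
    """Years by which an exposure advances the outcome rate, computed as the
    ratio of the Cox log-hazard coefficients for exposure and for age.
    Negative values mean the risk is postponed (protective exposure)."""
    return math.log(hr_exposure) / math.log(hr_per_year_of_age)

# Highest vs lowest fruit/vegetable quartile (HR 0.90), with a
# hypothetical age effect of HR 1.10 per year of age
print(round(rate_advancement_period(0.90, 1.10), 2))  # -1.11: risk postponed ~1.1 years
```

With this assumed age coefficient the postponement comes out close to the 1.12 years the abstract reports, but that is illustrative only; the actual figure depends on the fitted age coefficient.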
CLUE participants who donated a blood sample in 1974 and 1989 and completed a food frequency questionnaire in 1989 (N = 6,151) were included in the analysis. Participants were followed to the date of death or January 1, 2002. Compared with those in the bottom fifth, participants in the highest fifth of fruit and vegetable intake had a lower risk of all-cause (cases = 910; hazard ratio (HR) = 0.63, 95% confidence interval (CI): 0.51, 0.78; p-trend = 0.0004), cancer (cases = 307; HR = 0.65, 95% CI: 0.45, 0.93; p-trend = 0.08) and cardiovascular disease (cases = 225; HR = 0.76, 95% CI: 0.54, 1.06; p-trend = 0.15) mortality. Higher intake of cruciferous vegetables was associated with a lower risk of all-cause mortality (HR = 0.74, 95% CI: 0.60, 0.91; p-trend = 0.04). No statistically significant associations were observed between dietary vitamin C, vitamin E and beta-carotene intake and mortality. Overall, greater intake of fruits and vegetables was associated with lower risk of all-cause, cancer and cardiovascular disease death. These findings support the general health recommendation to consume multiple servings of fruits and vegetables (5-9/day). OBJECTIVE: To determine whether responses to simple dietary questions are associated with specific causes of death. DESIGN: Self-reported frequency intakes of various classes of foods and data on confounding factors were collected at the baseline survey. Death notifications up to 31 December 1997 were ascertained from the Office for National Statistics. Relative risks (RR) of death and 95% confidence intervals (CI) associated with baseline dietary factors were calculated by Cox regression. SETTING: Prospective follow-up study based on five UK general practices. SUBJECTS: Data were used from 11,090 men and women aged 35-64 years (81% of the eligible patient population) who responded to a postal questionnaire in 1989.
RESULTS: After 9 years of follow-up, 598 deaths were recorded, 514 of these among the 10,522 subjects with no previous history of angina. All-cause mortality was positively associated with age, smoking and low social class, as expected. Among the dietary variables, all-cause mortality was significantly reduced in participants who reported relatively high consumption of vegetables, of puddings, cakes, biscuits and sweets, and of fresh or frozen red meat (but not processed meat), among those who reported using polyunsaturated spreads, and among moderate alcohol drinkers. These associations were broadly similar for deaths from ischaemic heart disease (IHD), cancer and all other causes combined, and were not greatly attenuated by adjusting for potential confounding factors including social class. CONCLUSIONS: Responses to simple questions about nutrition were associated with mortality. These findings must be interpreted with caution, since residual confounding by dietary and lifestyle factors may underlie the associations. Few prospective studies have examined the effects of different types of dairy food on the risks of type 2 diabetes, CHD and mortality. We examined whether intakes of total dairy, high-fat dairy, low-fat dairy, milk and fermented dairy products were related to these outcomes in the Whitehall II prospective cohort study. At baseline, dairy consumption was assessed by FFQ among 4526 subjects (72% men) with a mean age of 56 (sd 6) years. Death certificates and medical records were used to ascertain CHD mortality and non-fatal myocardial infarction. Incident diabetes was detected by the oral glucose tolerance test or self-report. Incidence data were analysed using Cox proportional hazards models, adjusted for lifestyle and dietary factors. During approximately 10 years of follow-up, 273 diabetes, 323 CHD and 237 all-cause mortality cases occurred.
In multivariable models, intakes of total dairy and types of dairy products were not significantly associated with incident diabetes or CHD (all P values for trend > 0·1). Fermented dairy product intake was inversely associated with overall mortality (hazard ratios approximately 0·7 in the middle and highest tertiles; P for trend < 0·01) but not with incident CHD or diabetes (P>0·3). In conclusion, intakes of total dairy and types of dairy products showed no consistent relationship with incident diabetes, CHD or all-cause mortality. The available large prospective studies supporting an inverse association between better adherence to the Mediterranean diet and lower mortality have mainly included older adults. It is not clear whether this inverse association is also present among younger individuals at lower mortality risk. Our aim was to assess the association between adherence to the Mediterranean diet and total mortality in middle-aged adults from the Seguimiento Universidad de Navarra (SUN) Project. We followed 15,535 Spanish university graduates for a mean of 6.8 y. Their mean age was 38 ± 12 y, 59.6% were female, and all were initially free of cardiovascular disease, cancer and diabetes. A validated FFQ was used to assess dietary habits. Adherence to the Mediterranean diet was categorized into 3 groups according to the Mediterranean Diet Score (low, 0-2 points; moderate, 3-5 points; and high, 6-9 points). The outcome variable was total mortality. Cox proportional hazards models were used to estimate HRs and 95% CIs. We adjusted the estimates for sex, age, years of university education, BMI, smoking, physical activity, television watching, history of depression, and baseline hypertension and hypercholesterolemia. We observed 125 deaths during 105,980 person-years of follow-up.
The fully adjusted HRs for moderate and high adherence were 0.58 (95% CI: 0.34, 0.99; P = 0.05) and 0.38 (95% CI: 0.21, 0.70; P = 0.002), respectively. For each 2-point increment in the Mediterranean Diet Score, the HR of death was 0.72 (95% CI: 0.58, 0.91; P = 0.006). Among highly educated, middle-aged adults, adherence to the traditional Mediterranean diet was associated with a reduced risk of death. The Mediterranean diet has been widely promoted and may be associated with chronic disease prevention and a better overall health status. The aim of this study was to evaluate whether the Mediterranean diet score inversely predicted total or cause-specific mortality in a prospective population study in Northern Sweden (Västerbotten Intervention Program). The analyses were performed in 77,151 participants (whose diet was measured by means of a validated FFQ) using Cox proportional hazards models adjusted for several potential confounders. The Mediterranean diet score was inversely associated with all-cause mortality in men [HR = 0.96 (95% CI = 0.93, 0.99)] and women [HR = 0.95 (95% CI = 0.91, 0.99)], although not in obese men. In men, but not in women, the score was inversely associated with total cancer mortality [HR = 0.92 (95% CI = 0.87, 0.98)], particularly for pancreas cancer [HR = 0.82 (95% CI = 0.68, 0.99)]. Cardiovascular mortality was inversely associated with diet only in women [HR = 0.90 (95% CI = 0.82, 0.99)]. Except for alcohol [HR = 0.83 (95% CI = 0.76, 0.90)] and fruit intake [HR = 0.90 (95% CI = 0.83, 0.98)], no food item of the Mediterranean diet score independently predicted mortality. Higher scores were associated with increasing age, education and physical activity. Moreover, healthful dietary and lifestyle-related factors additively decreased the likelihood of mortality.
Even in a subarctic region, increasing Mediterranean diet scores were associated with a longer life, although the protective effect of diet was of small magnitude compared with the other healthful dietary and lifestyle-related factors examined. BACKGROUND: Asian populations habitually consume a large amount of cruciferous vegetables and other plant-based foods. Few epidemiologic investigations have evaluated the potential health effects of these foods in Asian populations. OBJECTIVE: We aimed to examine the associations of cruciferous vegetable, noncruciferous vegetable, total vegetable and total fruit intake with risk of all-cause and cause-specific mortality. DESIGN: The analysis included 134,796 Chinese adults who participated in 2 population-based, prospective cohort studies: the Shanghai Women's Health Study and the Shanghai Men's Health Study. Dietary intakes were assessed at baseline through in-person interviews using validated food-frequency questionnaires. Deaths were ascertained by biennial home visits and linkage with vital statistics registries. RESULTS: We identified 3442 deaths among women during a mean follow-up of 10.2 y and 1951 deaths among men during a mean follow-up of 4.6 y. Overall, fruit and vegetable intake was inversely associated with risk of total mortality in both women and men, and a dose-response pattern was particularly evident for cruciferous vegetable intake. The pooled multivariate hazard ratios (95% CIs) for total mortality across increasing quintiles of intake were 1 (reference), 0.91 (0.84, 0.98), 0.88 (0.77, 1.00), 0.85 (0.76, 0.96) and 0.78 (0.71, 0.85) for cruciferous vegetables (P < 0.0001 for trend), and 0.88 (0.79, 0.97), 0.88 (0.79, 0.98), 0.76 (0.62, 0.92) and 0.84 (0.69, 1.00) for total vegetables (P = 0.03 for trend). The inverse associations were primarily related to cardiovascular disease mortality but not to cancer mortality.
CONCLUSION: Our findings support recommendations to increase consumption of vegetables, particularly cruciferous vegetables, and fruit to promote cardiovascular health and overall longevity. PURPOSE: Although high consumption of fish may be one of the contributing factors to Japanese longevity, no epidemiological study using Japanese data has tested this hypothesis. SUBJECTS AND METHODS: The relationship between fish consumption and all-cause as well as cause-specific mortality was analyzed using the database of NIPPON DATA 80. At baseline in 1980, medical history, physical and blood biochemical measurements and a nutritional survey by the food-frequency method were performed in randomly selected community-based subjects aged 30 years and over in Japan. After exclusion of subjects with significant comorbidities at baseline, we followed 3945 men and 4934 women for 19 years. Men and women were analyzed comprehensively. Age- and sex-adjusted and multivariate-adjusted relative risks for all-cause or cause-specific mortality were calculated using a Cox proportional hazards model with delayed entry. RESULTS: During 19 years of follow-up, there were 1745 deaths. Subjects were divided into 5 groups according to fish consumption frequency. The multivariate Cox analyses showed that relative risks for subjects who ate fish more than twice daily, compared with subjects who ate fish 1 to 2 times weekly, were 0.99 (95% confidence interval: 0.77-1.27) for all-cause, 1.26 (0.70-2.29) for stroke, 0.92 (0.20-4.23) for cerebral hemorrhage, 1.09 (0.48-2.43) for cerebral infarction, and 0.91 (0.35-2.35) for coronary heart disease mortality.
CONCLUSION: Our results did not provide evidence in support of the fish hypothesis, perhaps because the majority of the Japanese subjects in the study ate more fish than the threshold level shown to be beneficial in previous studies. Objective: To examine the association between adherence to the Japanese Food Guide Spinning Top and total and cause-specific mortality. Design: Large-scale population-based prospective cohort study in Japan with follow-up for a median of 15 years. Setting: 11 public health centre areas across Japan. Participants: 36,624 men and 42,970 women aged 45-75 who had no history of cancer, stroke, ischaemic heart disease or chronic liver disease. Main outcome measures: Deaths and causes of death identified with the residential registry and death certificates. Results: Higher scores on the food guide (better adherence) were associated with lower total mortality; the multivariable-adjusted hazard ratios (95% confidence intervals) of total mortality for the lowest through highest scores were 1.00, 0.92 (0.87 to 0.97), 0.88 (0.83 to 0.93) and 0.85 (0.79 to 0.91) (P<0.001 for trend), and the multivariable-adjusted hazard ratio associated with a 10-point increase in food guide score was 0.93 (0.91 to 0.95; P<0.001 for trend). This score was inversely associated with mortality from cardiovascular disease (hazard ratio associated with a 10-point increase 0.93, 0.89 to 0.98; P=0.005 for trend) and particularly from cerebrovascular disease (0.89, 0.82 to 0.95; P=0.002 for trend). There was some evidence, though not significant, of an inverse association for cancer mortality (0.96, 0.93 to 1.00; P=0.053 for trend).
Conclusion: Closer adherence to the Japanese dietary guidelines was associated with a lower risk of total mortality and of mortality from cardiovascular disease, particularly cerebrovascular disease, in Japanese adults. This study investigated the relation between fish consumption, all-cause mortality and incidence of coronary heart disease (CHD). A total of 4,513 men and 3,984 women aged 30-70 years, sampled randomly from the population of Copenhagen County, Denmark, and initially examined in 1982-1992, were followed until 2000 for all-cause mortality and until 1997 for first admission to hospital or death from CHD. Information on fish consumption was obtained from a self-administered food-frequency questionnaire. Cox proportional hazards analysis gave no evidence for an inverse association between fish consumption and all-cause mortality or incident CHD after adjustment for confounders. Among subjects with an a priori defined high risk of CHD there was a nonsignificant inverse relation between fish intake and CHD morbidity (hazard ratio 1.28 (0.92-1.80) for a fish consumption of two times per month or less compared with once a week), but there were relatively few cases in this subgroup. These data provide no evidence for a protective effect of fish consumption on all-cause mortality or incident CHD in the population as a whole, but it cannot be excluded that frequent consumption of fish benefits those at high risk of CHD. BACKGROUND: Although previous studies have linked intake of sugars with incidence of cancer and other chronic diseases, its association with mortality remains unknown. OBJECTIVE: We investigated the association of total sugars, added sugars, total fructose, added fructose, sucrose and added sucrose with the risk of all-cause, cardiovascular disease, cancer and other-cause mortality in the NIH-AARP Diet and Health Study.
DESIGN: The participants (n = 353,751), aged 50-71 y, were followed for up to 13 y. Intake of individual sugars over the previous 12 mo was assessed at baseline using a 124-item NIH Diet History Questionnaire. RESULTS: In fully adjusted models (fifth quartile compared with first quartile), all-cause mortality was positively associated with the intake of total sugars [HR (95% CI): 1.13 (1.06, 1.20); P-trend < 0.0001], total fructose [1.10 (1.04, 1.17); P-trend < 0.0001] and added fructose [1.07 (1.01, 1.13); P-trend = 0.005] in women, and total fructose [1.06 (1.01, 1.10); P-trend = 0.002] in men. In men, a weak inverse association was found between other-cause mortality and dietary added sugars (P-trend = 0.04), sucrose (P-trend = 0.03) and added sucrose (P-trend = 0.006). Investigation of consumption of sugars by source showed that the positive association with mortality risk was confined only to sugars from beverages, whereas the inverse association was confined to sugars from solid foods. CONCLUSIONS: In this large prospective study, total fructose intake was weakly positively associated with all-cause mortality in both women and men, whereas added sugar, sucrose and added sucrose intakes were inversely associated with other-cause mortality in men. In our analyses, intake of added sugars was not associated with an increased risk of mortality. The NIH-AARP Diet and Health Study was registered at clinicaltrials.gov as NCT00340015. BACKGROUND: We examined associations between fish and n-3 LCFA intake and mortality in a prospective study with a large proportion of blacks with low socioeconomic status. METHODS AND RESULTS: We observed 6914 deaths among 77,604 participants with dietary data (follow-up time 5.5 years). Of these, 77,100 participants had available time-to-event data.
We investigated associations of mortality with fish and n-3 LCFA intake, adjusting for age, race, sex, kcal/day, body mass index (BMI), smoking, alcohol consumption, physical activity, income, education, chronic disease, insurance coverage and meat intake. Intakes of fried fish, baked/grilled fish and total fish, but not tuna, were associated with lower mortality among all participants. Analysis of trends in overall mortality by quintiles of intake showed that intakes of fried fish, baked/grilled fish and total fish, but not tuna, were associated with a lower risk of total mortality among all participants. When participants with chronic disease were excluded, the observed association remained only for intakes of baked/grilled fish, while fried fish was associated with a lower risk of mortality in participants with prevalent chronic disease. The association between n-3 LCFA intake and lower risk of mortality was significant among those with diabetes at baseline. There was an inverse association of mortality with fried fish intake in men, but not women. Total fish and baked/grilled fish intakes were associated with lower mortality among blacks, while fried fish intake was associated with lower mortality among whites. Effect modifications were not statistically significant. CONCLUSION: Our findings suggest a modest benefit of fish consumption on mortality. Background/Objectives: Long-term observational cohorts provide the opportunity to investigate the potential impact of dietary patterns on death. We aimed to investigate all-cause death according to the consumption of selected food groups, and then to identify those independently associated with reduced mortality. Subjects/Methods: Population survey of middle-aged men randomly selected in the period 1995-1997 from the general population of three French areas and followed over a median of 14.8 years. Dietary data were collected through a 3-day food record.
Cox modeling was used to assess the risk of death according to selected food groups after extensive adjustment for confounders, including a diet quality index. Results: The study population comprised 960 men (mean age 55.5 ± 6.2 years). After a median follow-up of 14.8 (interquartile range 14.3-15.2) years, 150 (15.6%) subjects had died. Food groups that remained independently predictive of a lower risk of death after extensive adjustment were an above-median consumption of milk (adjusted relative risk: 0.61, 95% confidence interval (CI): 0.43-0.86, P-value=0.005) and of fruits and vegetables (0.68, 0.46-0.98, P-value=0.041), and a moderate consumption of yogurts and cottage cheese (0.50, 95% CI: 0.31-0.81, P-value=0.005), other cheeses (0.62, 0.39-0.97, P-value=0.036) and bread (0.57, 0.37-0.89, P-value=0.014). In addition, there was a nonsignificant trend toward a higher risk of death associated with the highest sodium intakes. Conclusions: Consumption of food groups that largely match recommendations is associated with a reduced risk of all-cause death in men. A diet providing moderate amounts of diverse food groups appears associated with the highest life expectancy. Aims/hypothesis: Thus far, it is unclear whether lifestyle recommendations for people with diabetes should be different from those for the general public. We investigated whether the associations between lifestyle factors and mortality risk differ between individuals with and without diabetes. Methods: Within the European Prospective Investigation into Cancer and Nutrition (EPIC), a cohort was formed of 6,384 persons with diabetes and 258,911 EPIC participants without known diabetes. Joint Cox proportional hazards regression models of people with and without diabetes were built for the following lifestyle factors in relation to overall mortality risk: BMI, waist/height ratio, 26 food groups, alcohol consumption, leisure-time physical activity and smoking.
Likelihood ratio tests for heterogeneity assessed statistical differences in regression coefficients. Results: Multivariable-adjusted mortality risk among individuals with diabetes compared with those without was increased, with an HR of 1.62 (95% CI 1.51, 1.75). Intake of fruit, legumes, nuts, seeds, pasta, poultry and vegetable oil was related to a lower mortality risk, and intake of butter and margarine was related to an increased mortality risk. These associations were significantly different in magnitude from those in diabetes-free individuals, but the directions were similar. No differences between people with and without diabetes were detected for the other lifestyle factors. Conclusions/interpretation: Diabetes status did not substantially influence the associations between lifestyle and mortality risk. People with diabetes may benefit more from a healthy diet, but the directions of association were similar. Thus, our study suggests that lifestyle advice with respect to mortality for patients with diabetes should not differ from recommendations for the general population. BACKGROUND: Previous studies have suggested that nut consumption is associated with beneficial cardiovascular outcomes. However, limited data are available on the association between nut intake and all-cause mortality. OBJECTIVE: Our aim was to test the hypothesis that nut consumption is inversely associated with the risk of all-cause mortality. DESIGN: In this prospective cohort study of 20,742 male physicians, we assessed nut intake between 1999 and 2002 via a food-frequency questionnaire and ascertained deaths through an endpoint committee. We used Cox regression to estimate multivariable-adjusted HRs for death according to nut consumption. In secondary analyses, we evaluated associations of nut consumption with cause-specific mortality. RESULTS: During a mean follow-up of 9.6 y, there were 2732 deaths. The mean (±SD) age at baseline was 66.6 ± 9.3 y.
Median nut consumption was 1 serving/wk. Multivariable-adjusted HRs (95% CIs) were 1.0 (reference), 0.92 (0.83, 1.01), 0.85 (0.76, 0.96), 0.86 (0.75, 0.98) and 0.74 (0.63, 0.87) for nut consumption of never or <1 serving/mo, 1-3 servings/mo, 1 serving/wk, 2-4 servings/wk and ≥5 servings/wk, respectively (P-linear trend < 0.0001), after adjustment for age, body mass index, alcohol use, smoking, exercise, prevalent diabetes and hypertension, and intakes of energy, saturated fat, fruit and vegetables, and red meat. In a secondary analysis, results were consistent for cardiovascular disease mortality but only suggestive and non-statistically significant for coronary artery disease and cancer mortality. CONCLUSION: Our data are consistent with an inverse association between nut consumption and the risk of all-cause and cardiovascular disease mortality in US male physicians. Objective: To study whether mortality is reduced among whole grain eaters in Norway. Design: Non-interventional, prospective; baseline 1977-1983, followed for mortality through to 1994. Setting: Three Norwegian counties. Subjects: A total of 16,933 men and 16,915 women; systematic screening of all residents aged 35-56 y at baseline, not disabled and free of cardiovascular disease (79% response rate). Predictor variable: We combined self-reports of the type and number of bread slices eaten (white, light whole grain, dense whole grain) to form a whole grain bread score, with a range of 0.05 (1 slice per day, made with 5% whole grain flour) to 5.4 (9 slices per day, made with 60% whole grain flour). Results: Norwegian whole grain bread eaters were less likely to be smokers, were more physically active, had lower serum cholesterol and systolic blood pressure, and ate less total and saturated fat as a proportion of energy intake than white bread eaters.
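The whole grain bread score described above is simply the number of daily bread slices weighted by the whole-grain flour fraction of the bread type eaten; the quoted endpoints of its range fall out directly. A minimal sketch:

```python
def whole_grain_bread_score(slices_per_day, whole_grain_fraction):
    """Whole grain bread score: daily bread slices weighted by the
    whole-grain flour fraction (0-1) of the bread type eaten."""
    return slices_per_day * whole_grain_fraction

print(whole_grain_bread_score(1, 0.05))            # 0.05, the reported minimum
print(round(whole_grain_bread_score(9, 0.60), 2))  # 5.4, the reported maximum
```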
After adjustment for age, energy intake, sex, serum cholesterol, systolic blood pressure, smoking, body mass index, physical activity at leisure and work, and use of cod liver oil or other vitamin supplements, hazard rate ratios (HRR) for total mortality were inverse and graded across whole grain bread score categories (category 5 vs category 1 HRR: 0.75, 95% confidence interval 0.63-0.89 in men and 0.66, 0.44-0.98 in women). Conclusion: Protection by whole grain intake against chronic disease is suggested in Norway, where four times as much whole grain is consumed as in the United States. Sponsorship: Institute for Nutrition Research, University of Oslo, and National Health Screening Service, Oslo, Norway. European Journal of Clinical Nutrition (2001) 55. Purpose: Reports on the protective effect of a Mediterranean diet on mortality usually refer to populations from Mediterranean countries, leaving it uncertain whether diet really is the fundamental cause. Our aim was to examine the effect of a Mediterranean diet on mortality in Switzerland, a country combining cultural influences from Mediterranean and Central European countries within a common national health and statistical registry. Methods: In this prospective investigation, we included 17,861 men and women aged ≥16 years who participated in health studies in 1977-1993 and were followed up for survival until 2008 by anonymous record linkage with the Swiss National Cohort. A 9-point Mediterranean Diet Score (MDS) was used to assess adherence to a Mediterranean diet. Mortality hazard ratios (HR) and 95% confidence intervals (CIs) were calculated using Cox regression models adjusted for age, sex, survey wave, marital status, smoking, body mass index, language region and nationality. Results: In all language regions, MDS was inversely associated with mortality. Consumption of dairy products was also consistently associated with lower mortality.
When dairy food consumption was categorized as beneficial instead of harmful, the association between MDS and mortality increased in strength and was partly statistically significant. The inverse association held for all causes of death combined (HR for a one-point increase in MDS 0.96, 95% CI 0.94-0.98 overall; 0.94, 0.92-0.97 in men; 0.98, 0.95-1.02 in women), for cardiovascular disease (CVD: 0.96, 0.92-0.99; 0.95, 0.90-1.00; 0.98, 0.92-1.04, respectively) and for cancer (0.95, 0.92-0.99; 0.92, 0.88-0.97; 0.98, 0.93-1.04). Conclusions: Stronger adherence to a Mediterranean diet was associated with lower all-cause, CVD and cancer mortality, largely independently of cultural background. These associations were primarily due to the effect in men. Our finding of a beneficial rather than a deleterious impact of dairy product consumption prompts consideration of culturally adapted Mediterranean diet recommendations. However, results should be interpreted with caution, since only a crude 1-day dietary estimate was available to assess individuals' habitual dietary intake. BACKGROUND: There is limited research examining beverage habits, one of the most habitual dietary behaviors, in relation to mortality risk. OBJECTIVE: This study examined the association between coffee, black and green tea, sugar-sweetened beverages (soft drinks and juice) and alcohol and all-cause and cause-specific mortality. METHODS: A prospective data analysis was conducted with the use of the Singapore Chinese Health Study, including 52,584 Chinese men and women (aged 45-74 y) free of diabetes, cardiovascular disease (CVD) and cancer at baseline (1993-1998) and followed through 2011, with 10,029 deaths. Beverages were examined in relation to all-cause and cause-specific (cancer, CVD and respiratory disease) mortality risk with the use of Cox proportional hazards regression. RESULTS: The associations between coffee, black tea and alcohol intake and all-cause mortality were modified by smoking status.
Among never-smokers there was an inverse dose-response association between higher amounts of coffee and black tea intake and all-cause , respiratory-related , and CVD mortality ( black tea only ) . The fully adjusted HRs for all-cause mortality for coffee for < 1/d , 1/d , and ≥2/d relative to no coffee intake were 0.89 , 0.86 , and 0.83 , respectively ( P-trend = 0.0003 ) . For the same black tea categories the HRs were 0.95 , 0.90 , and 0.72 , respectively ( P-trend = 0.0005 ) . Among ever-smokers there was no association between coffee or black tea and the outcomes . Relative to no alcohol , light to moderate intake was inversely associated with all-cause mortality ( HR : 0.87 ; 95 % CI : 0.79 , 0.96 ) in never-smokers with a similar magnitude of association in ever-smokers . There was no association between heavy alcohol intake and all-cause mortality in never-smokers and a strong positive association in ever-smokers ( HR : 1.56 ; 95 % CI : 1.40 , 1.74 ) . Green tea and sugar-sweetened beverages were not associated with all-cause or cause-specific mortality . CONCLUSIONS Higher coffee and black tea intake was inversely associated with mortality in never-smokers , light to moderate alcohol intake was inversely associated with mortality regardless of smoking status , heavy alcohol intake was positively associated with mortality in ever-smokers , and there was no association between sugar-sweetened beverages and green tea and mortality Purpose The association between vegetable and fruit consumption and risk of cancer and cardiovascular disease ( CVD ) has been investigated by several studies , whereas fewer studies have examined consumption of vegetables and fruits in relation to all-cause mortality . Studies on berries , a rich source of antioxidants , are rare . 
The purpose of the current study was to examine the association between intake of vegetables , fruits and berries ( together and separately ) and the risk of all-cause mortality and cause-specific mortality due to cancer and CVD and subtypes of these , in a cohort with very long follow-up . Methods We used data from a population-based prospective Norwegian cohort study of 10,000 men followed from 1968 through 2008 . Information on vegetable , fruit and berry consumption was available from a food frequency questionnaire . Associations between these and all-cause mortality and cause-specific mortality due to cancers and CVDs were investigated using Cox proportional hazard regression models . Results Men who in total consumed vegetables , fruit and berries more than 27 times per month had an 8–10 % reduced risk of all-cause mortality compared with men with a lower consumption . They also had a 20 % reduced risk of stroke mortality . Consumption of fruit was inversely related to overall cancer mortality , with hazard rate ratios of 0.94 , 0.84 and 0.79 in the second , third and fourth quartile , respectively , compared with the first quartile . Conclusion Increased consumption of vegetables , fruits and berries was associated with a delayed risk of all-cause mortality and of mortality due to cancer and stroke BACKGROUND AND AIMS The published literature shows that nut consumption has a favorable impact on health . We aimed to assess the association between nut consumption and risk of 15-year total mortality , and mortality from cardiovascular disease ( CVD ) ( including ischemic heart disease , IHD , and stroke ) , and cancer . METHODS AND RESULTS Prospective analyses involved 2893 participants aged ≥49 years at baseline . Dietary data were collected by using a semi-quantitative food-frequency questionnaire , and nut intakes were calculated . Deaths and cause of death were confirmed by data linkage with the Australian National Death Index . 
Over 15 years , 1044 participants had died ; of these , 430 had died from stroke and another 430 had died from IHD . Participants in the second tertile of nut consumption versus those in the first tertile of intake had reduced risk of total mortality : multivariable-adjusted HR 0.76 ( 95 % CI 0.65 - 0.89 ) . Participants in the second tertile compared to those in the first tertile had 24 % and 23 % reduced risk of 15-year CVD and IHD mortality , respectively . Associations were more marked in women compared to men . Women in the second versus first tertile of nut consumption had 27 % , 39 % , 34 % and 49 % reduced risk of death from all causes ( n = 489 ) , CVD ( n = 258 ) , IHD ( n = 188 ) and stroke mortality ( n = 101 ) , respectively . CONCLUSIONS Nut consumption was independently associated with a decreased risk of overall and vascular-disease mortality , particularly in women Background Intakes of whole grains and cereal fiber have been inversely associated with the risk of chronic diseases ; however , their relation with total and disease-specific mortality remains unclear . We aimed to prospectively assess the association of whole grains and cereal fiber intake with all-cause and cause-specific mortality . Methods The study included 367,442 participants from the prospective NIH-AARP Diet and Health Study ( enrolled in 1995 and followed through 2009 ) . Participants with cancer , heart disease , stroke , diabetes , and self-reported end-stage renal disease at baseline were excluded . Results Over an average of 14 years of follow-up , a total of 46,067 deaths were documented . Consumption of whole grains was inversely associated with risk of all-cause mortality and death from cancer , cardiovascular disease ( CVD ) , diabetes , respiratory disease , infections , and other causes . 
In multivariable models , as compared with individuals with the lowest intakes , those in the highest intake of whole grains had a 17 % ( 95 % CI , 14–19 % ) lower risk of all-cause mortality and 11–48 % lower risk of disease-specific mortality ( all P for trend < 0.023 ) ; those in the highest intake of cereal fiber had a 19 % ( 95 % CI , 16–21 % ) lower risk of all-cause mortality and 15–34 % lower risk of disease-specific mortality ( all P for trend < 0.005 ) . When cereal fiber was further adjusted for , the associations of whole grains with death from CVD , respiratory disease and infections became non-significant ; the associations with all-cause mortality and death from cancer and diabetes were attenuated but remained significant ( P for trend < 0.029 ) . Conclusions Consumption of whole grains and cereal fiber was inversely associated with total and cause-specific mortality . Our data suggest cereal fiber is one potentially protective component OBJECTIVES The objective of our study was to test the hypothesis that fish or omega-3 polyunsaturated fatty acid ( PUFA ) intakes would be inversely associated with risks of mortality from ischemic heart disease , cardiac arrest , heart failure , stroke , and total cardiovascular disease . BACKGROUND Data on associations of dietary intake of fish and of omega-3 PUFA with risk of cardiovascular disease among Asian societies have been limited . METHODS We conducted a prospective study consisting of 57,972 Japanese men and women . Dietary intakes of fish and omega-3 PUFA were determined by food frequency questionnaire , and participants were followed up for 12.7 years . Hazard ratios and 95 % confidence intervals were calculated according to quintiles of fish or omega-3 PUFA intake . 
RESULTS We observed generally inverse associations of fish and omega-3 PUFA intakes with risks of mortality from heart failure ( multivariable hazard ratio [ 95 % confidence interval ] for highest versus lowest quintiles = 0.76 [ 0.53 to 1.09 ] for fish and 0.58 [ 0.36 to 0.93 ] for omega-3 PUFA ) . Associations with ischemic heart disease or myocardial infarction were relatively weak and not statistically significant after adjustment for potential risk factors . Neither fish nor omega-3 PUFA dietary intake was associated with mortality from total stroke , its subtypes , or cardiac arrest . For mortality from total cardiovascular disease , intakes of fish and omega-3 PUFA were associated with 18 % to 19 % lower risk . CONCLUSIONS We found an inverse association between fish and omega-3 PUFA dietary intakes and cardiovascular mortality , especially for heart failure , suggesting a protective effect of fish intake on cardiovascular diseases IMPORTANCE Higher intake of whole grains has been associated with a lower risk of major chronic diseases , such as type 2 diabetes mellitus and cardiovascular disease ( CVD ) , although limited prospective evidence exists regarding whole grains ' association with mortality . OBJECTIVE To examine the association between dietary whole grain consumption and risk of mortality . DESIGN , SETTING , AND PARTICIPANTS We investigated 74 341 women from the Nurses ' Health Study ( 1984 - 2010 ) and 43 744 men from the Health Professionals Follow-Up Study ( 1986 - 2010 ) , 2 large prospective cohort studies . All patients were free of CVD and cancer at baseline . MAIN OUTCOMES AND MEASURES Hazard ratios ( HRs ) for total mortality and mortality due to CVD and cancer according to quintiles of whole grain consumption , which was updated every 2 or 4 years by using validated food frequency questionnaires . RESULTS We documented 26 920 deaths during 2 727 006 person-years of follow-up . 
After multivariate adjustment for potential confounders , including age , smoking , body mass index , physical activity , and modified Alternate Healthy Eating Index score , higher whole grain intake was associated with lower total and CVD mortality but not cancer mortality : the pooled HRs for quintiles 1 through 5 , respectively , of whole grain intake were 1 ( reference ) , 0.99 ( 95 % CI , 0.95 - 1.02 ) , 0.98 ( 95 % CI , 0.95 - 1.02 ) , 0.97 ( 95 % CI , 0.93 - 1.01 ) , and 0.91 ( 95 % CI , 0.88 - 0.95 ) for total mortality ( P for trend < .001 ) ; 1 ( reference ) , 0.94 ( 95 % CI , 0.88 - 1.01 ) , 0.94 ( 95 % CI , 0.87 - 1.01 ) , 0.87 ( 95 % CI , 0.80 - 0.94 ) , and 0.85 ( 95 % CI , 0.78 - 0.92 ) for CVD mortality ( P for trend < .001 ) ; and 1 ( reference ) , 1.02 ( 95 % CI , 0.96 - 1.08 ) , 1.05 ( 95 % CI , 0.99 - 1.12 ) , 1.04 ( 95 % CI , 0.98 - 1.11 ) , and 0.97 ( 95 % CI , 0.91 - 1.04 ) for cancer mortality ( P for trend = .43 ) . We further estimated that every serving ( 28 g/d ) of whole grain consumption was associated with a 5 % ( 95 % CI , 2%-7 % ) lower total mortality or a 9 % ( 95 % CI , 4%-13 % ) lower CVD mortality , whereas the same intake level was nonsignificantly associated with lower cancer mortality ( HR , 0.98 ; 95 % CI , 0.94 - 1.02 ) . Similar inverse associations were observed between bran intake and CVD mortality , with a pooled HR of 0.80 ( 95 % CI , 0.73 - 0.87 ; P for trend < .001 ) , whereas germ intake was not associated with CVD mortality after adjustment for bran intake . CONCLUSIONS AND RELEVANCE These data indicate that higher whole grain consumption is associated with lower total and CVD mortality in US men and women , independent of other dietary and lifestyle factors . 
These results are in line with recommendations that promote increased whole grain consumption to facilitate disease prevention BACKGROUND Total or red meat intake has been shown to be associated with a higher risk of mortality in Western populations , but little is known of the risks in Asian populations . OBJECTIVE We examined temporal trends in meat consumption and associations between meat intake and all-cause and cause-specific mortality in Asia . DESIGN We used ecological data from the United Nations to compare country-specific meat consumption . Separately , 8 Asian prospective cohort studies in Bangladesh , China , Japan , Korea , and Taiwan consisting of 112,310 men and 184,411 women were followed for 6.6 to 15.6 y with 24,283 all-cause , 9558 cancer , and 6373 cardiovascular disease ( CVD ) deaths . We estimated the study-specific HRs and 95 % CIs by using a Cox regression model and pooled them by using a random-effects model . RESULTS Red meat consumption was substantially lower in the Asian countries than in the United States . Fish and seafood consumption was higher in Japan and Korea than in the United States . Our pooled analysis found no association between intake of total meat ( red meat , poultry , and fish/seafood ) and risks of all-cause , CVD , or cancer mortality among men and women ; HRs ( 95 % CIs ) for all-cause mortality from a comparison of the highest with the lowest quartile were 1.02 ( 0.91 , 1.15 ) in men and 0.93 ( 0.86 , 1.01 ) in women . CONCLUSIONS Ecological data indicate an increase in meat intake in Asian countries ; however , our pooled analysis did not provide evidence of a higher risk of mortality for total meat intake and provided evidence of an inverse association with red meat , poultry , and fish/seafood . Red meat intake was inversely associated with CVD mortality in men and with cancer mortality in women in Asian countries
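The pooled Asian cohort analysis above combines study-specific HRs "by using a random-effects model". A minimal sketch of the standard DerSimonian-Laird random-effects pooling of log hazard ratios, using made-up log HRs and standard errors rather than the authors' data:

```python
import math

def dersimonian_laird(log_hrs, ses):
    """Pool study-specific log hazard ratios with a DerSimonian-Laird
    random-effects model. Returns the pooled HR and its standard error."""
    w = [1 / s ** 2 for s in ses]                     # fixed-effect weights
    fixed = sum(wi * b for wi, b in zip(w, log_hrs)) / sum(w)
    # Cochran's Q and the method-of-moments between-study variance tau^2
    q = sum(wi * (b - fixed) ** 2 for wi, b in zip(w, log_hrs))
    df = len(log_hrs) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # random-effects weights fold tau^2 into each study's variance
    w_re = [1 / (s ** 2 + tau2) for s in ses]
    pooled = sum(wi * b for wi, b in zip(w_re, log_hrs)) / sum(w_re)
    se_pooled = math.sqrt(1 / sum(w_re))
    return math.exp(pooled), se_pooled

# Three hypothetical studies (log HR, SE) -- illustration only
hr, se = dersimonian_laird([0.02, -0.07, 0.10], [0.06, 0.08, 0.07])
print(f"pooled HR {hr:.2f} (SE of log HR {se:.3f})")
```

When the studies are homogeneous (Q ≤ df), tau² collapses to zero and the estimate reduces to the ordinary inverse-variance fixed-effect pool.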
11,057
24,766,530
The CR is a rare event in advanced CRC ; however , the addition of MoAbs to first-line chemotherapy significantly increases the curative rate of metastatic disease compared with controls
AIM The study assessed whether the addition of monoclonal antibodies ( MoAbs ) to first-line chemotherapy for advanced colorectal cancer ( CRC ) increases the complete response ( CR ) compared with controls .
Only a few papers have been published concerning the incidence and outcome of patients with a pathological complete response after cytotoxic treatment in breast cancer . The purpose of this retrospective study was to assess the outcome of patients found to have a pathological complete response in both the breast and axillary lymph nodes after neoadjuvant chemotherapy for operable breast cancer . Our goal was also to determine whether the residual pathological size of the tumour in the breast could be correlated with pathological node status . Between 1982 and 2000 , 451 consecutive patients were registered into five prospective phase II trials . After six cycles , 396 patients underwent surgery , with axillary dissection for 277 patients ( 69.9 % ) . Pathological response was evaluated according to Chevallier 's classification . At a median follow-up of 8 years , survival was analysed as a function of pathological response . A pathological complete response was obtained in 60 patients ( 15.2 % ) after induction chemotherapy . Breast tumour persistence was significantly related to positive axillary nodes ( P=5×10−6 ) . At 15 years , overall survival and disease-free survival rates were significantly higher in the group who had a pathological complete response than in the group who had less than a pathological complete response ( P=0.047 and P=0.024 , respectively ) . In the absence of a pathological complete response , and furthermore when there is notable remaining pathological disease , axillary dissection is still important to determine a major prognostic factor ; subsequently , a second non-cross-resistant adjuvant regimen or high-dose chemotherapy could lead to a survival benefit On June 20 , 2006 , the U.S. Food and Drug Administration ( FDA ) approved bevacizumab ( Avastin ; Genentech , Inc. 
, South San Francisco , CA ) , administered in combination with FOLFOX4 ( 5-fluorouracil , leucovorin , and oxaliplatin ) for the second-line treatment of metastatic carcinoma of the colon or rectum . Efficacy and safety were demonstrated in one Eastern Cooperative Oncology Group ( ECOG ) open-label , multicenter , randomized , three-arm , active-controlled trial enrolling 829 adult patients . Patients had received a fluoropyrimidine- and irinotecan-based regimen as initial therapy for metastatic disease ; or they had received prior adjuvant irinotecan-based chemotherapy and had recurred within 6 months of completing therapy . Treatments included bevacizumab , 10 mg/kg , as a 90-minute i.v . infusion on day 1 , every 2 weeks , either alone or in combination with FOLFOX4 , or FOLFOX4 alone . The bevacizumab monotherapy arm was closed to accrual after an interim efficacy analysis suggested a possibly shorter survival in that arm . Overall survival ( OS ) , the primary study endpoint , was significantly longer for patients receiving bevacizumab in combination with FOLFOX4 than for those receiving FOLFOX4 alone . The objective response rate was significantly higher in the FOLFOX4 plus bevacizumab arm than in the FOLFOX4 alone arm . The duration of response was approximately 6 months for both treatment arms . Patients treated with the bevacizumab combination were also reported , based on investigator assessment , to have significantly longer progression-free survival . There were no new bevacizumab safety signals . The most serious , and sometimes fatal , bevacizumab toxicities are gastrointestinal perforation , wound-healing complications , hemorrhage , arterial thromboembolic events , hypertensive crisis , nephrotic syndrome , and congestive heart failure PURPOSE The advantage of chemotherapy in asymptomatic patients with advanced colorectal cancer is debatable . 
Whether early chemotherapy improves survival and the length of the symptom-free period versus no therapy until symptoms appear was studied in a randomized trial . PATIENTS AND METHODS A total of 183 patients with advanced , but asymptomatic colorectal cancer were randomly allocated to receive either initial treatment with sequential methotrexate 250 mg/m2 during the first 2 hours , and fluorouracil ( 5-FU ) 500 mg/m2 at hours 3 and 23 followed by leucovorin rescue initiated at hour 24 ( MFL ) for 12 courses , or primary expectancy with chemotherapy not considered until symptoms appeared . One patient was ineligible and excluded from analysis . Nine patients did not fulfill the inclusion criteria and five patients refused treatment allocation ; these patients were not excluded from the study population so as not to introduce bias . So far , 51 of 90 ( 60 % ) patients in the expectancy group have received chemotherapy . RESULTS Overall survival was better in the MFL group than in the expectancy group ( Breslow-Gehan , P less than .02 ; log-rank , P = .13 ) with a difference in median survival of approximately 5 months . Also the symptom-free period and the time to disease progression were longer in the MFL group ( P less than .001 ) , with median differences of 8 and 4 months , respectively . Toxicity of MFL treatment was low ; however , three patients died because of toxicity -- none of them should have received therapy because of poor performance or S-creatinine elevation . The patients maintained an excellent performance throughout the MFL treatment unless the disease was progressive . 
CONCLUSION We concluded that early treatment with MFL in asymptomatic patients with advanced colorectal cancer prolongs survival , the asymptomatic period , and the time to disease progression by approximately 6 months over primary expectancy PURPOSE In this report , we update overall survival ( OS ) and time-to-progression ( TTP ) data for the Intergroup trial N9741 after a median 5 years of follow-up by using risk-stratified and prognostic factor analyses to determine if treatment outcomes differ in specific patient subgroups . PATIENTS AND METHODS A total of 1,691 patients were randomly assigned to one of seven fluorouracil- , oxaliplatin- , and irinotecan-containing regimens . OS and TTP were calculated by treatment arm and baseline risk group ( on the basis of WBC , performance status , number of sites of disease , and alkaline phosphatase ) . Multivariate prognostic factor analysis was used to assess clinical factors for their relationships to OS , TTP , response , and toxicity by using Cox and logistic regression models . RESULTS The observed 5-year survival with infusional fluorouracil , leucovorin , and oxaliplatin ( FOLFOX ) of 9.8 % was better than with irinotecan plus bolus fluorouracil and leucovorin ( IFL ; 3.7 % ; P = .04 ) or with bolus irinotecan/oxaliplatin ( IROX ; 5.1 % ; P = .128 ) . OS and TTP were significantly longer for FOLFOX ( 20.2 months and 8.9 months , respectively ) than for IFL ( 14.6 months and 6.1 months , respectively ; P < .001 for both ) or for IROX ( 17.3 months and 6.7 months , respectively ; P < .001 for both ) . OS differed by risk group : 20.7 months for low risk , 17.4 months for intermediate risk , and 9.4 months for high risk ( P < .001 ) . FOLFOX treatment was superior in all risk groups and was the most powerful prognostic factor for OS , TTP , response rate , and toxicity . CONCLUSION The 9.8 % 5-year OS in patients with metastatic colorectal cancer who were treated with first-line FOLFOX sets a new benchmark . 
Neither baseline risk group nor any prognostic factor examined was predictive of treatment-specific outcome . However , treatment efficacy and patient longevity varied as a function of risk group
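The abstracts above report hazard ratios with 95 % confidence intervals estimated from Cox regression models. Given a fitted log-hazard coefficient and its standard error, the reported HR and CI follow from simple exponentiation; a minimal sketch, where the coefficient β = -0.04 and SE = 0.011 are hypothetical values chosen only to produce numbers of the magnitude seen above:

```python
import math

def hazard_ratio_ci(beta, se, z=1.96):
    """Convert a Cox log-hazard coefficient and its standard error into
    a hazard ratio with a Wald-type 95% confidence interval."""
    hr = math.exp(beta)
    lo = math.exp(beta - z * se)
    hi = math.exp(beta + z * se)
    return hr, lo, hi

# Hypothetical coefficient for a one-unit increase in some exposure
hr, lo, hi = hazard_ratio_ci(-0.04, 0.011)
print(f"HR {hr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # -> HR 0.96 (95% CI 0.94-0.98)
```

An HR below 1 with a CI that excludes 1, as here, is what the abstracts describe as a statistically significant inverse association.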
11,058
28,913,413
Conclusions While OKC and CKC may be beneficial during ACL surgical rehabilitation , there is insufficient evidence to suggest that either one is superior to the other
Background There is no consensus on whether closed kinetic chain ( CKC ) or open kinetic chain ( OKC ) exercises should be the intervention of choice following an anterior cruciate ligament ( ACL ) injury or reconstruction .
PURPOSE OF THE STUDY The aim of this work was to assess the usefulness of specialized rehabilitation sessions after anterior cruciate ligament reconstruction in high-level athletes participating in regional- , national- , or international-level sports . MATERIAL AND METHODS We conducted a retrospective comparison between two populations : in the first , rehabilitation was limited to the early postoperative period ( PO ) ; in the second , rehabilitation sessions were repeated , once in the early postoperative period and again 90 days later ( POR ) . The second rehabilitation period lasted two or three weeks and included an open- and closed-chain muscle training program controlled by physical therapists using isokinetic devices . The sessions also included proprioceptive exercises and cardiovascular exercise reconditioning with a therapeutic project developed by the Physical Medicine and Rehabilitation physician in collaboration with other health professionals and taking into consideration the surgical recommendations and the patient 's clinical status . A questionnaire was addressed to all patients one year after the operation . The response rate was 55 % . Two groups were created at random : 74 patients were in the PO group and 75 in the POR group . The two populations were comparable in terms of gender ( 64 men , 10 women in PO versus 57 men and 18 women in POR ) , sports level ( regional , national , international ) , type of surgery ( 41 hamstring , 33 bone-patellar tendon-bone in PO and 43 hamstring and 32 bone-patellar tendon-bone in POR ) and sports ( generally rugby , soccer , handball or ski ) . RESULTS Our results were statistically in favor of the POR group in terms of resumption of sports activities : time to return to training ( 7.6 versus 8.7 months , p=0.03 ) and time to return to competition ( 9.06 versus 10.84 months , p=0.007 ) . 
They were also in favor of the POR group for resumption of sports activities at the former level ( 52.05 % versus 19.44 % , p=0.001 ) , pain ( numerical scale : 1.52 versus 2.03 , p=0.021 ) and subjective impression ( IKDC subjective score : 87.58 % versus 81.64 % , p=0.003 ) . There was no significant difference for resumption of training ( 90.50 % versus 81.69 % , p=0.2 ) or competition ( 71.80 % versus 56.76 % , p=0.1 ) at one year , but the results were close to the level of significance . DISCUSSION Adapted preparation before returning to sports activities using a scheme elaborated by specialists ( physicians , physical therapists , trainers ) provided a statistically significant improvement in the time and quality of resumed activity ( return to prior level , pain , subjective IKDC score ) in this specific population of high-level competition athletes . A larger series would be interesting to check the statistical significance of return to training and competition at one year . Another follow-up beyond one year ( 1.5 or two years for example ) would also provide information on the percentage of definitive return to sports in these two populations Introduction : Several established tools are available to assess study quality and reporting of randomized controlled trials ; however , these tools were designed with clinical intervention trials in mind . In exercise training intervention trials some of the traditional study quality criteria , such as participant or researcher blinding , are extremely difficult to implement . Methods : We developed the Tool for the assEssment of Study qualiTy and reporting in EXercise ( TESTEX ) – a study quality and reporting assessment tool , designed specifically for use in exercise training studies . Our tool is a 15-point scale ( 5 points for study quality and 10 points for reporting ) and addresses previously unmentioned quality assessment criteria specific to exercise training studies . 
Results : There were no systematic differences between the summated TESTEX scores of each observer [ H(2) = 0.392 , P = 0.822 ] . There was a significant association between the summated TESTEX scores of the three observers , with almost perfect agreement between observers 1 and 2 [ intra-class correlation coefficient ( ICC ) = 0.93 , 95 % confidence interval ( CI ) 0.82–0.97 , P < 0.001 ] , observers 1 and 3 ( ICC = 0.96 , 95 % CI 0.89–0.98 , P < 0.001 ) and observers 2 and 3 ( ICC = 0.91 , 95 % CI 0.75–0.96 , P < 0.001 ) . Conclusions : The TESTEX scale is a new , reliable tool , specific to exercise scientists , that facilitates a comprehensive review of exercise training trials A more sports-specific and detailed strength assessment has been advocated for patients after anterior cruciate ligament ( ACL ) injury and reconstruction . The purpose of this study was to develop a test battery of lower extremity strength tests with high ability to discriminate between leg power development on the injured and uninjured sides in patients after ACL injury and in patients who have undergone ACL reconstruction . Twenty-three patients were tested 6 months after ACL injury and 44 patients were tested 6 months after ACL reconstruction . Twenty-four of the 44 patients were operated on using a hamstrings graft and 20 patients were operated on using a patellar tendon graft . All the patients performed a test battery of three strength tests for each leg in a randomised order . The three strength tests were chosen to reflect quadriceps and hamstring muscular power in a knee-extension and a knee-flexion test ( open kinetic chain ) and lower-extremity muscular power in a leg-press test ( closed kinetic chain ) . There was a higher sensitivity for the test battery to discriminate abnormal leg power compared with any of the three strength tests individually . 
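The TESTEX abstract above quantifies inter-observer agreement with intra-class correlation coefficients. A minimal sketch of ICC(3,1) (two-way mixed effects, consistency, single rater), run on hypothetical summated scores rather than the study's data:

```python
def icc_consistency(ratings):
    """ICC(3,1): two-way mixed, consistency, single rater.
    `ratings` is a list of rows, one row per rated study,
    one column per observer."""
    n = len(ratings)       # number of targets (studies scored)
    k = len(ratings[0])    # number of raters (observers)
    grand = sum(sum(r) for r in ratings) / (n * k)
    row_means = [sum(r) / k for r in ratings]
    col_means = [sum(r[j] for r in ratings) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for r in ratings for x in r)
    ss_err = ss_total - ss_rows - ss_cols          # residual after rows+cols
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Four hypothetical studies scored by three observers -- illustration only
scores = [[12, 11, 12], [9, 9, 8], [14, 13, 14], [7, 8, 7]]
print(round(icc_consistency(scores), 2))  # -> 0.95
```

Values above roughly 0.9, as reported in the abstract, are conventionally read as "almost perfect" agreement.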
Nine out of ten patients after ACL reconstruction and six out of ten patients after ACL injury exhibited abnormal leg power symmetry using the test battery . Thus , this test battery had high ability in terms of discriminating between the leg power performance on the injured and uninjured side , both in patients with an ACL injury and in patients who have undergone ACL reconstruction . It is concluded that a test battery consisting of a knee-extension , knee-flexion and leg-press muscle power test had high ability to determine deficits in leg power 6 months after ACL injury and reconstruction . Only a minority of the patients had restored leg muscle power . The clinical relevance is that the test battery may contribute to the decision-making process when deciding whether and when patients can safely return to strenuous physical activities after an ACL injury or reconstruction [ Purpose ] To compare outcomes of anterior cruciate ligament ( ACL ) reconstruction after open kinetic chain ( OKC ) exercises and closed kinetic chain ( CKC ) exercises . [ Subjects and Methods ] The subjects comprised 11 female and 47 male patients who were randomly divided into two groups : Group I , which performed a CKC exercise program , and Group II , which performed an OKC exercise program . Pain intensity was evaluated using a visual analogue scale ( VAS ) . Knee flexion was evaluated using a universal goniometer , and thigh circumference measurements were taken with a tape measure at baseline and at 3 months and 6 months after the treatment . Lysholm scores were used to assess knee function . [ Results ] There were no significant differences between the two groups at baseline . Within each group , VAS values and knee flexion were improved after the surgery . These improvements were significantly higher in the CKC group than in the OKC group . There were increases in thigh circumference difference at the 3 and 6 month assessments post-surgery . 
A greater improvement in the Lysholm score was observed in the CKC group at 6 months . [ Conclusion ] The CKC exercise program was more effective than OKC in improving the knee function of patients with ACL reconstruction BACKGROUND AND PURPOSE Nondistally fixated ( ie , what is often referred to as " open kinetic chain " [ OKC ] ) knee extensor resistance training appears to have lost favor for some forms of rehabilitation due partly to concerns that this exercise will irritate the extensor mechanism . In this randomized , single-blind clinical trial , nondistally fixated versus distally fixated ( ie , often called " closed kinetic chain " [ CKC ] ) leg extensor training were compared for their effects on knee pain . SUBJECTS Forty-three patients recovering from anterior cruciate ligament ( ACL ) reconstruction surgery ( 34 male , 9 female ; mean age=29 years , SD=7.9 , range=16 - 54 ) . METHODS Knee pain was measured at 2 and 6 weeks after ACL reconstruction surgery using visual analog scales in a self-assessment questionnaire and during maximal isometric contractions of the knee extensors . Between test sessions , subjects trained 3 times per week using either OKC or CKC resistance of their knee and hip extensors as part of their physical therapy . RESULTS No differences in knee pain were found between the treatment groups . DISCUSSION AND CONCLUSION Open kinetic chain and CKC leg extensor training in the early period after ACL reconstruction surgery do not differ in their immediate effects on anterior knee pain . Based on these findings , further studies are needed using different exercise dosages and patient groups BACKGROUND Subjective evaluation by patients has recently become an important adjunct to postoperative clinical assessment . Apart from standard physical examination , scales demonstrating patients ' subjective outcome measures are used for assessing the efficacy of the treatment and rehabilitation of knee ligamentous injuries . 
The present work presents patients ' subjective assessment of rehabilitation protocols after ACL reconstruction . MATERIAL AND METHODS Forty individuals who had undergone ACL reconstruction were randomised into two groups ( G1 , G2 ) and followed one of two rehabilitation protocols ( A or B ) . The subjects assessed their knee function at baseline and after physical therapy using the Lysholm score and the IKDC form . The results were analysed with the Mann-Whitney U test . The subjects also completed a questionnaire at discharge . RESULTS Analysis of Lysholm and IKDC scores revealed significant differences between the mean pre- and postrehabilitation results in the groups ( p<0.05 ) . The greatest improvement was seen in G2 patients rehabilitated with protocol B , with significant mean improvements of 56.3 % and 46.7 % for the Lysholm and IKDC scores , respectively . Group G1 registered only half of this magnitude of change . Protocol B was also highly rated in the questionnaires . CONCLUSION In the patients ' opinion , a rehabilitation protocol involving a larger number of open kinetic chain exercises may prove more effective in the rehabilitation of patients after ACL reconstruction compared to a programme based mainly on closed kinetic chain exercises The goal of this prospective study was to determine the outcome-predictive role of various parameters in the nonoperative treatment of chronic anterior knee pain patients . Thirty patients followed a five-week treatment program , which consisted of only closed kinetic chain exercises . Prior to this treatment all subjects were evaluated on muscular characteristics , subjective symptoms , weight , sex , duration of symptoms and functional performance . A multiple stepwise regression analysis revealed that the reflex response time of m. 
vastus medialis obliquus (VMO) (P=0.041; 0.026) and the duration of symptoms (P=0.019; 0.045) were the only two parameters significantly associated with the outcome (evaluated by the Kujala score) at five weeks and at three months. The shorter the duration of symptoms, or the faster the reflex response time of VMO prior to the treatment, the better the outcome after a closed kinetic chain exercise program. The statistical significance of these parameters in this study may be seen as an indication of their importance as predictors of the outcome of a closed kinetic chain strengthening program. Using this information, it seems clinically important to begin the treatment program before the anterior knee pain becomes more chronic and treatment results deteriorate We compared the diagnostic and predictive value of magnetic resonance imaging (MRI) and clinical findings with arthroscopy in 61 knees in a prospective study. In meniscal tears, the accuracy and positive predictive value of MRI were found to be nearly twice those of clinical examination. The sensitivity, specificity, and negative predictive value of MRI were comparable to the figures found in other studies. We recommend MRI as a clarifying diagnostic tool when a clinical examination indicates a lesion of the meniscus. In our study, the clinical relevance of MRI in anterior cruciate ligament lesions, and especially in cartilage lesions, was more doubtful. The combination of clinical and MRI findings would reduce the number of negative arthroscopies to 5%. MRI is a valuable diagnostic tool in planning the type of anesthesia and treatment, and could significantly reduce the need for a second arthroscopy Open kinetic chain (OKC) knee extensor resistance training has lost favour in ACLR rehabilitation due to concerns that this exercise is harmful to the graft and will be less effective in improving function.
In this randomized, single-blind clinical trial, OKC and closed kinetic chain (CKC) knee extensor training were compared for their effects on knee laxity and function in the middle period of ACLR rehabilitation. The study subjects were 49 patients recovering from ACLR surgery (37 M, 12 F; mean age=33 years). Tests were carried out at 8 and 14 weeks after ACLR, with knee laxity measured using a ligament arthrometer and function with the Hughston Clinic knee self-assessment questionnaire and single-leg, maximal-effort jump testing (post-test only). Between tests, subjects trained using either OKC or CKC resistance of their knee and hip extensors as part of formal physical therapy sessions three times per week. No statistically significant (one-way ANOVA, p>0.05) differences were found between the treatment groups in knee laxity or leg function. OKC and CKC knee extensor training in the middle period of rehabilitation after ACLR surgery do not differ in their effects on knee laxity or leg function. Exercise dosages are described in this study, and further research is required to assess whether these findings are dosage specific Abstract. Knee extensor resistance training using open kinetic chain (OKC) exercise for patients recovering from anterior cruciate ligament reconstruction (ACLR) surgery has lost favour mainly because of research indicating that OKC exercise causes greater ACL strain than closed kinetic chain (CKC) exercise. In this prospective, randomized clinical trial the effects of these two regimes on knee laxity were compared in the early period after ACLR surgery. Thirty-six patients recovering from ACLR surgery (29 males, 7 females; mean age=30) were tested at 2 and 6 weeks after ACLR, with knee laxity measured using the Knee Signature System arthrometer.
Between tests, subjects trained using either OKC or CKC resistance of their knee and hip extensors in formal physical therapy sessions three times per week. Following adjustment for site of treatment, pretraining injured-knee laxity, and untreated-knee laxity at post-training, the use of OKC exercise, when compared to CKC exercise, was found to lead to a 9% increase in looseness with a 95% confidence interval of –8% to +29%. These results indicate that the great concern about the safety of OKC knee extensor training in the early period after ACLR surgery may not be well founded BACKGROUND There is no consensus among the existing published evidence as to whether closed kinetic chain (CKC) or open kinetic chain (OKC) exercises should be the intervention of choice following an anterior cruciate ligament (ACL) injury or reconstruction. The commonly held belief has been that OKC exercises cause increased strain on the ACL as well as increased joint laxity and anterior tibial translation. OBJECTIVE To investigate the effects of OKC and CKC exercises on the knees of patients with ACL deficiency or reconstruction. DATA SOURCES MEDLINE, ProQuest Medical Library, and CINAHL STUDY SELECTION: Six articles were chosen for inclusion in the systematic review. The authors narrowed 50 articles down to 6 by review of titles and abstracts. Included articles were randomized controlled trials written in English, published during 2000-2008, that evaluated the effects of OKC and CKC exercises on ACL-deficient or reconstructed knees. DATA EXTRACTION Quality of the included studies was defined by the PEDro scale (1), which has been found to be reliable (2). DATA SYNTHESIS Scores on the PEDro scale (1) ranged from 4-6/10. One article found positive significant effects with inclusion of OKC exercises in the rehabilitation program and another found significant benefits with combining OKC and CKC exercises.
CKC exercises alone were not found by any studies to be superior to OKC exercises. CONCLUSION These studies reveal favorable results for utilization of both open and closed kinetic chain exercises for intervention with ACL-deficient or reconstructed knees. However, further research needs to be completed Abstract. Rehabilitation after anterior cruciate ligament (ACL) reconstruction has focused over the past decade on closed kinetic chain (CKC) exercises due to presumably less strain on the graft than with isokinetic open kinetic chain (OKC) exercises; however, recent reports suggest that there are only minor differences in ACL strain values between some CKC and OKC exercises. We studied anterior knee laxity, thigh muscle torque, and return to preinjury sports level in 44 patients with unilateral ACL reconstruction; group 1 carried out quadriceps strengthening only with CKC exercises, while group 2 trained with CKC plus OKC exercises starting from week 6 after surgery. Anterior knee laxity was determined with a KT-1000 arthrometer; isokinetic concentric and eccentric quadriceps and hamstring muscle torque were studied with a Kin-Com dynamometer before and 6 months after surgery. At an average of 31 months after surgery the patients answered a questionnaire regarding their current knee function and physical activity/sports to determine the extent and timing of their recovery. No significant differences in anterior knee laxity were noted between the groups 6 months postsurgery. Patients in group 2 increased their quadriceps torque significantly more than those in group 1, but no differences were found in hamstring torque between the groups. A significantly higher number of patients in group 2 (n=12) than in group 1 (n=5) returned to sports at the same level as before the injury (P<0.05). Patients from group 2 who returned to sports at the same level did so 2 months earlier than those in group 1.
Thus the addition of OKC quadriceps training after ACL reconstruction results in a significantly better improvement in quadriceps torque without reducing knee joint stability at 6 months, and also leads to a significantly higher number of athletes returning to their previous activity earlier and at the same level as before injury We conducted a prospective, randomized study of open and closed kinetic chain exercises during accelerated rehabilitation after anterior cruciate ligament reconstruction to determine if closed kinetic chain exercises are safe and if they offer any advantages over conventional rehabilitation. The closed kinetic chain group used a length of elastic tubing, the Sport Cord, to perform weightbearing exercises, and the open kinetic chain group used conventional physical therapy equipment. Results are reported with a minimum 1-year followup (mean, 19 months). Pre- and postoperative evaluation included the Lysholm knee function scoring scale, Tegner activity rating scale and KT-1000 arthrometer measurements. Overall, stability was restored in over 90% of the knees. Preoperative patellofemoral pain was reduced significantly; 95% of the patients had a full range of motion. The closed kinetic chain group had lower mean KT-1000 arthrometer side-to-side differences, less patellofemoral pain, was generally more satisfied with the end result, and more often thought they returned to normal daily activities and sports sooner than expected. We concluded that closed kinetic chain exercises are safe and effective and offer some important advantages over open kinetic chain exercises. As a result of this study, we now use the closed kinetic chain protocol exclusively after anterior cruciate ligament reconstruction
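One of the trials summarized above analysed pre- and post-rehabilitation Lysholm and IKDC scores with the Mann-Whitney U test. As a minimal sketch of what that test computes, the pure-Python function below derives the U statistic from midranks; the sample scores are hypothetical and are not taken from any of the cited studies.

```python
def mann_whitney_u(x, y):
    """Return (U1, U2) for two independent samples, using midranks for ties."""
    pooled = sorted(x + y)
    ranks = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2  # average of rank positions i+1 .. j
        i = j
    r1 = sum(ranks[v] for v in x)           # rank sum of the first sample
    n1, n2 = len(x), len(y)
    u1 = r1 - n1 * (n1 + 1) / 2
    return u1, n1 * n2 - u1

# Hypothetical post-rehabilitation Lysholm scores for two small groups
u1, u2 = mann_whitney_u([85, 90, 95], [60, 70, 80])
```

U ranges from 0 (complete separation of the groups) to n1*n2; a p-value would then come from the exact U distribution or a normal approximation, which is what statistical packages add on top of this statistic.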
11,059
30,526,170
The meta-analysis indicated that vibration significantly improved the VAS at 24, 48, and 72 hours after exercise, and significantly improved CK levels at 24 and 48 hours, but not at 72 hours. Conclusion Vibration is a beneficial and useful form of physiotherapy for alleviating DOMS.
Objective Delayed-onset muscle soreness (DOMS) is a symptom of exercise-induced muscle injury that is commonly encountered in athletes and fitness enthusiasts. Vibration is being increasingly used to prevent or treat DOMS. We therefore carried out a meta-analysis to evaluate the effectiveness of vibration in patients with DOMS.
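The pooling step behind a meta-analysis of continuous outcomes such as VAS scores is commonly a DerSimonian-Laird random-effects model. The sketch below shows that computation on hypothetical mean differences (vibration minus control) and variances; the numbers are illustrative, not the review's data.

```python
def dersimonian_laird(effects, variances):
    """Pool study effects with a DerSimonian-Laird random-effects model.

    Returns (pooled_effect, standard_error, tau_squared)."""
    w = [1.0 / v for v in variances]                    # inverse-variance weights
    fixed = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
    q = sum(wi * (ei - fixed) ** 2 for wi, ei in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)       # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]        # random-effects weights
    pooled = sum(wi * ei for wi, ei in zip(w_re, effects)) / sum(w_re)
    se = (1.0 / sum(w_re)) ** 0.5
    return pooled, se, tau2

# Hypothetical VAS mean differences at one time point from three trials
pooled, se, tau2 = dersimonian_laird([-2.0, -1.0, 0.0], [0.25, 0.25, 0.25])
```

With heterogeneous effects, tau² inflates each study's variance, so the random-effects pooled estimate carries a wider confidence interval than a fixed-effect analysis of the same data.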
INTRODUCTION The most popular method of stretching is static stretching. Vibration may provide a means of enhancing range of motion beyond that of static stretching alone. PURPOSE This study sought to observe the effects of vibration on static stretching to determine whether vibration-aided static stretching could enhance range of motion acquisition more than static stretching alone in the forward split position. METHODS Ten highly trained male volunteer gymnasts were randomly assigned to experimental (N = 5) and control (N = 5) groups. The test was a forward split with the rear knee flexed to prevent pelvic misalignment. Height of the anterior iliac spine of the pelvis was measured at the lowest split position. Athletes stretched forward and rearward legs to the point of discomfort for 10 s followed by 5 s of rest, repeated four times on each leg and split position (4 min total). The experimental group stretched with the device turned on; the control group stretched with the device turned off. A pretest was followed by an acute-phase posttest, then a second posttest measurement was performed following 4 wk of treatment. Difference scores were analyzed. RESULTS The acute phase showed dramatic increases in forward split flexibility for both legs (P < 0.05), whereas the long-term test showed a statistically significant increase in range of motion on the right rear-leg split only (P < 0.05). Effect sizes indicated large effects in all cases. CONCLUSION This study showed that vibration can be a promising means of increasing range of motion beyond that obtained with static stretching in highly trained male gymnasts The effects of hard squatting exercise with (VbX+) and without (VbX−) vibration on neuromuscular function were tested in 19 healthy young volunteers.
Before and after the exercise, three different tests were performed: maximum serial jumping for 30 s, electromyography during isometric knee extension at 70% of the maximum voluntary torque, and quantitative analysis of the patellar tendon reflex. Between VbX+ and VbX− values, there was no difference found under baseline conditions. Time to exhaustion was significantly shorter in VbX+ than in VbX− (349 ± 338 s versus 515 ± 338 s), but blood lactate (5.49 ± 2.73 mmol l−1 versus 5.00 ± 2.26 mmol l−1) and subjectively perceived exertion (rate of perceived exertion values 18.1 ± 1.2 versus 18.6 ± 1.6) at the termination of exercise indicate comparable levels of fatigue. After the exercise, comparable effects were observed on jump height, ground contact time, and isometric torque. The vastus lateralis mean frequency during isometric torque, however, was higher after VbX+ than after VbX−. Likewise, the tendon reflex amplitude was significantly greater after VbX+ than after VbX− (4.34 ± 3.63 Nm versus 1.68 ± 1.32 Nm). It follows that in exercise to comparable degrees of exhaustion and muscular fatigue, superimposed 26 Hz vibration appears to elicit an alteration in neuromuscular recruitment patterns, which apparently enhances neuromuscular excitability. Possibly, this effect may be exploited for the design of future training regimes Purpose To examine the acute and short-term effect of a wearable vibration device following strenuous eccentric exercise of the elbow flexors. Methods Physically active males (n = 13) performed vibration therapy (VT) and control following eccentric exercise. The arms were randomised and counterbalanced, separated by 14 days. 15 min of VT (120 Hz) was applied immediately and 24, 48, and 72 h after eccentric exercise, while the contralateral arm performed no VT (control).
Muscle (isometric and concentric) strength, range of motion, electromyography (EMG), muscle soreness and creatine kinase were taken pre-exercise, immediately and 24, 48, and 72 h post-eccentric exercise. Additionally, the acute effect of VT on muscle strength, range of motion, EMG, and muscle soreness was also investigated immediately after VT. Results In the short term, VT was able to significantly reduce the level of biceps brachii pain at 24 h (p < 0.05) and 72 h (p < 0.01), enhance pain threshold at 48 h (p < 0.01) and 72 h (p < 0.01), improve range of motion at 24 h (p < 0.05), 48 h (p < 0.01) and 72 h (p < 0.01), and significantly (p < 0.05) reduce creatine kinase at 72 h compared to control. Acutely, following VT treatment, muscle pain and range of motion significantly improved (p < 0.05) at 24 h post, 48 h post, and 72 h post, but no significant changes in muscle strength and EMG were reported acutely or short-term. Conclusions Acute and short-term VT attenuated muscle soreness and creatine kinase and improved range of motion; however, there was no improvement in muscle strength recovery compared to control following eccentric exercise of the elbow flexors Abstract The aim of this study was to assess the effects of a bout of whole body vibration (WBV) on muscle response and to determine whether this stimulus leads to muscle damage. Thirty healthy and physically active participants (mean±SD; age: 21.8±2.0 years; height: 176.7±5.8 cm; body mass: 76±6.8 kg and BMI: 23.1±3.7 kg·m−2) participated in this study. Participants were randomly allocated to one of two groups; one performed a bout of 360 s WBV (frequency: 30 Hz; peak-to-peak displacement: 4 mm) (VIB) and the other adopted a sham position (CON). Muscle contractile properties were analysed in the rectus femoris (RF) by using tensiomyography (TMG) 2 min before the warm-up and 2 min after intervention.
Muscle damage was assessed by determining plasma creatine kinase (CK) and lactate dehydrogenase (LDH) levels at three time points: 5 min before warm-up and 1 h and 48 h after the intervention. TMG results showed a significant decrease in maximal displacement (p<0.05) and delay time (p<0.05) in VIB, and in delay time (p<0.05) and relaxation time (p<0.05) in CON. Muscle damage markers showed significant group differences (p<0.05) for CK 1 h after the intervention. In addition, differences for CK 1 h after the intervention from baseline (p<0.05) were also observed in VIB. In conclusion, a 6-min bout of WBV results in an increase of muscle stiffness in RF and increased CK levels 1 h after intervention (returning to baseline within 48 h) Objective: The purpose of this study was to determine if vibration therapy is more effective than the standard treatment of stretching and massage for improving recovery of muscle strength and reducing muscle soreness after muscle damage induced by eccentric exercise. Design: A randomized, single-blinded parallel intervention trial design was used. Setting: Research laboratory. Participants: Fifty untrained men aged 18 to 30 years completed the study. Interventions: Participants performed 100 maximal eccentric muscle actions (ECCmax) of the right knee extensor muscles. For the next 7 days, 25 participants applied cycloidal vibration therapy to the knee extensors twice daily and 25 participants performed stretching and sports massage (SSM) twice daily. Main Outcome Measures: Changes in markers of muscle damage [peak isometric torque (PIT), serum creatine kinase (CK), and serum myoglobin (Mb)], muscle soreness (visual analog scale), and inflammation [serum C-reactive protein (CRP)] were assessed. Results: After ECCmax, there was no difference in recovery of PIT and muscle soreness or serum CK, Mb, and CRP levels between the vibration and SSM groups (P > 0.28).
Conclusions: Cycloidal vibration therapy is no more effective than the standard practice of stretching and massage in promoting muscle recovery after the performance of muscle-damaging exercise. Clinical Relevance: Prescription of vibration therapy after maximal exercise involving eccentric muscle damage did not alleviate signs and symptoms of muscle damage faster than the standard prescription of stretching and massage [Purpose] To investigate the effects of pre-induced muscle damage vibration stimulation on the pressure-pain threshold and muscle-fatigue-related metabolites of exercise-induced muscle damage. [Subjects and Methods] Thirty healthy, adult male subjects were randomly assigned to the pre-induced muscle damage vibration stimulation group, post-induced muscle damage vibration stimulation group, or control group (n=10 per group). To investigate the effects of pre-induced muscle damage vibration stimulation, changes in the pressure-pain threshold (lb), creatine kinase level (U/L), and lactate dehydrogenase level (U/L) were measured and analyzed at baseline and at 24 hours, 48 hours, and 72 hours after exercise. [Results] The pressure-pain thresholds and concentrations of creatine kinase and lactate dehydrogenase varied significantly in each group and during each measurement period. There were interactions between the measurement periods and groups, and results of the post-hoc test showed that the pre-induced muscle damage vibration stimulation group had the highest efficacy among the groups. [Conclusion] Pre-induced muscle damage vibration stimulation is more effective than post-induced muscle damage vibration stimulation for preventing muscle damage Objective In this study, the effects of vibration therapy (VT) on delayed-onset muscle soreness (DOMS) and associated inflammatory markers after downhill running were determined.
Methods 29 male recreational runners (33 (8) years; VO2peak 57 (6) ml kg−1 min−1) completed a 40-min downhill run and were randomly allocated to a VT group or a Control group. For 5 days post-run, the VT group underwent once-daily sessions of VT on the upper and lower legs. DOMS was assessed pre-run and for 5 days post-run by visual analogue scale. Immune cell subsets and plasma inflammatory markers were assessed pre-run, post-run, and 24 and 120 h post-run by full differential cell count, and by ELISA and enzyme immunoassay, respectively. Data were analysed as per cent change from pre-run (ANOVA) and by the magnitude of the treatment effect (Cohen's effect size statistics). Results VT significantly reduced calf pain 96 h post-run (−50% (40%), 90% confidence limits) and gluteal pain 96 h (−50% (40%)) and 120 h post-run (−30% (30%)); decreased interleukin 6 (IL6) 24 h (−46% (31%)) and 120 h post-run (−65% (30%)); substantially decreased histamine 24 h (−40% (50%)) and 120 h post-run (−37% (48%)); substantially increased neutrophils (8.6% (8.1%)) and significantly decreased lymphocytes (−17% (12%)) 24 h post-run. There were no clear substantial effects of VT on other leukocyte subsets and inflammatory markers. Conclusion VT reduces muscle soreness and IL6. It may stimulate lymphocyte and neutrophil responses and may be a useful modality in treating muscle inflammation The purpose of this study was to investigate the effects of a single bout of whole-body vibration on isometric squat (IS) and countermovement jump (CMJ) performance. Nine moderately resistance-trained men were tested for peak force (PF) during the IS and jump height (JH) and peak power (PP) during the CMJ. Average integrated electromyography (IEMG) was measured from the vastus medialis, vastus lateralis, and biceps femoris muscles.
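Several of the trials above report treatment effects as Cohen's effect size statistics. A minimal sketch of the pooled-standard-deviation form of Cohen's d follows; the input samples are made up for illustration and do not come from any cited study.

```python
def cohens_d(x, y):
    """Cohen's d for two independent samples, using the pooled standard deviation."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)   # unbiased sample variances
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    pooled_sd = (((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2)) ** 0.5
    return (mx - my) / pooled_sd

# Hypothetical soreness scores: d = 0.5 means the group means differ by half a pooled SD
d = cohens_d([2, 4, 6], [1, 3, 5])
```

By the usual convention, |d| around 0.2 is a small effect, 0.5 medium, and 0.8 or more large, which is the sense in which the abstracts describe "large effects".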
Subjects performed the 2 treatment conditions, vibration or sham, in a randomized order. Subjects were tested for baseline performance variables in both the IS and CMJ, and were exposed to either a 30-second bout of whole-body vibration or a sham intervention. Subjects were tested immediately following the vibration or sham treatment, as well as 5, 15, and 30 minutes posttreatment. Whole-body vibration resulted in a significantly higher (p ≤ 0.05) JH during the CMJ immediately following vibration, as compared with the sham condition. No significant differences were observed between vibration and sham treatments in CMJ PP; in PF during the IS; or in IEMG of the vastus medialis, vastus lateralis, or biceps femoris during the CMJ or IS. Whole-body vibration may be a potential warm-up procedure for increasing vertical JH. Future research is warranted addressing the influence of various protocols of whole-body vibration (i.e., duration, amplitude, frequency) on athletic performance CONTEXT Numerous recovery strategies have been used in an attempt to minimize the symptoms of delayed-onset muscle soreness (DOMS). Whole-body vibration (WBV) has been suggested as a viable warm-up for athletes. However, scientific evidence to support the protective effects of WBV training (WBVT) on muscle damage is lacking. OBJECTIVE To investigate the acute effect of WBVT applied before eccentric exercise in the prevention of DOMS. DESIGN Randomized controlled trial. SETTING University laboratory. PATIENTS OR OTHER PARTICIPANTS A total of 32 healthy, untrained volunteers were randomly assigned to either the WBVT (n = 15) or control (n = 17) group. INTERVENTION(S) Volunteers performed 6 sets of 10 maximal isokinetic (60°/s) eccentric contractions of the dominant-limb knee extensors on a dynamometer.
In the WBVT group, the training was applied using a vibratory platform (35 Hz, 5 mm peak to peak) with 100° of knee flexion for 60 seconds before eccentric exercise. No vibration was applied in the control group. MAIN OUTCOME MEASURE(S) Muscle soreness, thigh circumference, and pressure pain threshold were recorded at baseline and at 1, 2, 3, 4, 7, and 14 days postexercise. Maximal voluntary isometric and isokinetic knee extensor strength were assessed at baseline, immediately after exercise, and at 1, 2, 7, and 14 days postexercise. Serum creatine kinase was measured at baseline and at 1, 2, and 7 days postexercise. RESULTS The WBVT group showed a reduction in DOMS symptoms in the form of less maximal isometric and isokinetic voluntary strength loss, lower creatine kinase levels, and less pressure pain threshold and muscle soreness (P < .05) compared with the control group. However, no effect on thigh circumference was evident (P < .05). CONCLUSIONS Administered before eccentric exercise, WBVT may reduce DOMS via muscle function improvement. Further investigation should be undertaken to ascertain the effectiveness of WBVT in attenuating DOMS in athletes Abstract Wheeler, AA and Jacobson, BH. Effect of whole-body vibration on delayed onset muscular soreness, flexibility, and power. J Strength Cond Res 27(9): 2527–2532, 2013—Delayed onset muscle soreness (DOMS) occurs after unaccustomed or intense bouts of exercise. The effects of DOMS peak at approximately 48 hours postexercise, and DOMS is treated, albeit not highly successfully, in a variety of ways including the use of medication and therapeutic modalities. The objective of this study was to determine the effects of whole-body vibration (WBV) on DOMS through Visual Analog Scale (VAS) measures of perceived pain/soreness and to assess the effect of WBV on flexibility and explosive power after induced DOMS.
Twenty healthy college-aged participants (10 men and 10 women) volunteered for this study and were randomly assigned to the experimental or the control group. Participants completed baseline measures for VAS, hamstring and lower back flexibility, and explosive power before completing a DOMS-inducing exercise. Measures for VAS, hamstring and lower back flexibility, and explosive power were taken immediately postexercise and again immediately posttreatment. Participants reported back to the laboratory for 4 additional data collection sessions. Both the experimental and control groups yielded significant differences (p < 0.05) in pretest and posttest DOMS between baseline and pretest and posttest 1, pretest and posttest 2, and pretest and posttest 3. No significance (p > 0.05) was found within or between groups when comparing pre-assessments and post-assessments of DOMS, flexibility, or explosive power. No differences (p > 0.05) between WBV and light exercise were found for DOMS, flexibility, and explosive power. These results suggest that WBV is equally as effective as light exercise in reducing the severity of DOMS. Thus, WBV may be used as a recovery option in addition to current treatments OBJECTIVES To compare the effects of vibration therapy and massage in the prevention of DOMS. METHODS A pre-test and post-test control-group design was used; 45 healthy female non-athletic subjects were recruited and randomly distributed to three groups (15 subjects in each group). After the subjects' initial status was measured, the experimental groups received a vibration therapy (50 Hz vibration for five minutes) or massage therapy (15 minutes) intervention and the control group received no treatment, just prior to the eccentric exercise.
Subjects underwent the following measurements to evaluate changes in muscle condition: muscle soreness (pain perception), range of motion (ROM), maximum isometric force (MIF), repetition maximum (RM), and lactate dehydrogenase (LDH) and creatine kinase (CK) levels. All the parameters except LDH, CK and 1RM were measured before, immediately post intervention, immediately post exercise, and 24 hours, 48 hours and 72 hours post exercise. LDH, CK and 1RM were measured before and 48 hours post exercise. RESULT Muscle soreness was reported to be significantly less in the experimental (vibration and massage) groups (p=0.000) as compared to the control group at 24, 48, and 72 hours post-exercise. The experimental and control groups did not show any significant difference in MIF immediately (p=0.2898), or at 24 hours (p=0.4173), 48 hours (p=0.752) and 72 hours (p=0.5297) post-exercise. Range of motion demonstrated significant recovery in the experimental groups at 48 hours (p=0.0016) and 72 hours (p=0.0463). Massage therapy showed significant recovery in 1RM (p=0.000) compared to the control group, and vibration therapy showed a significantly lower LDH level (p=0.000) 48 hours post exercise compared to the control group. CK at 48 hours post exercise in the vibration group (p=0.000) and massage group (p=0.002) showed a significant difference as compared to the control group. CONCLUSION Vibration therapy and massage are equally effective in the prevention of DOMS. Massage is effective in restoration of concentric strength (1RM). Yet vibration therapy shows a clinically earlier reduction of pain and is effective in decreasing the level of LDH in the 48 hours post-exercise period [Purpose] The aim of this study was to investigate whether or not a single whole-body vibration treatment after eccentric exercise can reduce muscle soreness and enhance muscle recovery.
[Subjects and Methods] Twenty untrained participants were randomly assigned to two groups: a vibration group (n=10) and a control group (n=10). Participants performed eccentric quadriceps training of 4 sets of 5 repetitions at 120% 1RM, with 4 min rest between sets. After that, the vibration group received 3 sets of 1 min whole body vibration (12 Hz, 4 mm) with 30 s of passive recovery between sets. Serum creatine kinase, blood urea nitrogen, muscle soreness (visual analog scale) and muscle strength (peak isometric torque) were assessed. [Results] Creatine kinase was lower in the vibration group than in the control group at 24 h (200.2 ± 8.2 vs. 300.5 ± 26.1 U/L) and at 48 h (175.2 ± 12.5 vs. 285.2 ± 19.7 U/L) post-exercise. Muscle soreness decreased in the vibration group compared to the control group at 48 h post-exercise (34.1 ± 11.4 vs. 65.2 ± 13.2 mm). [Conclusion] A single whole-body vibration treatment after eccentric exercise reduced delayed onset muscle soreness but did not affect muscle strength recovery UNLABELLED Evidence suggests large-diameter afferents, presumably in response to centrally mediated changes, augment the mechanical allodynia or hyperalgesia seen in delayed onset muscle soreness (DOMS) conditions. Healthy males aged 18 to 30 (n = 16) performed eccentric exercise eliciting DOMS in the tibialis anterior muscle of a randomly assigned exercised leg. The contralateral leg served as a control. Mechanosensitivity was assessed on the exercised and control legs prior to and 24 hours postexercise via pressure pain thresholds (PPTs). PPTs were assessed at the muscle site and at a distant segmentally related site, either without vibration or with vibration concurrently applied to the distant muscle, segmentally related, or control extra-segmentally related site. Participants completed a 6-point Likert scale providing a subjective measure of DOMS 5 days postexercise.
Baseline mechanosensitivity was not significantly different at any site between the exercised and control legs prior to the exercise. Soreness ratings were higher 24 to 48 hours postexercise (P < .05), and baseline PPTs at the exercised leg's muscle site decreased postexercise (P < .001). On day 1 following exercise, segmentally related site PPTs reduced significantly when vibration was applied concurrently to the DOMS-affected tibialis anterior muscle (P < .04) compared to baseline mechanosensitivity or extrasegmental control vibration. PERSPECTIVE Further evidence is presented by this article indicating that large-diameter afferents, presumably via centrally mediated mechanisms, augment the mechanical hyperalgesia seen in DOMS conditions. Future research examining eccentric activity in individuals with likely centrally sensitized conditions may be warranted Delayed onset muscle soreness (DOMS), which may occur after eccentric exercise, may cause some reduction in ability in sport activities. For this reason, several studies have been designed on preventing and controlling DOMS. As vibration training (VT) may improve muscle performance, we designed this study to investigate the effect of VT on controlling and preventing DOMS after eccentric exercise. Methods: Fifty healthy non-athletic volunteers were assigned randomly into two experimental groups, VT (n = 25) and non-VT (n = 25). A vibrator was used to apply 50 Hz vibration on the left and right quadriceps, hamstring and calf muscles for 1 min in the VT group, while no vibration was applied in the non-VT group. Then, both groups walked downhill on a 10° declined treadmill at a speed of 4 km/hour.
The measurements included the isometric maximum voluntary contraction force (IMVC) of the left and right quadriceps muscles and the pressure pain threshold (PPT) 5, 10 and 15 cm above the patella and at the mid-line of the calf muscles of both lower limbs, before and the day after treadmill walking. After 24 hours, the serum level of creatine kinase (CK) and the DOMS level (visual analogue scale) were measured. Results: The results showed decreased IMVC force (P = 0.006), reduced PPT (P = 0.0001) and significantly increased mean DOMS and CK levels in the non-VT group compared to the VT group (P = 0.001). Conclusion: A comparison of the experimental groups indicates that VT before eccentric exercise may prevent and control DOMS. Further studies should be undertaken to ascertain the stability and effectiveness of VT in athletics. Exercise involving eccentric muscle contractions is known to decrease range of motion and increase passive muscle stiffness. This study aimed to use ultrasound shear wave elastography to investigate acute changes in biceps brachii passive stiffness following intense barbell curl exercise involving both concentric and eccentric contractions. The effect of local vibration (LV) as a recovery modality from exercise-induced increased stiffness was further investigated. Eleven subjects performed 4 bouts of 10 bilateral barbell curl movements at 70% of the one-rep maximal flexion force. An arm-to-arm comparison model was then used, with one arm randomly assigned to the passive recovery condition and the other arm assigned to the LV recovery condition (10 min of 55-Hz vibration frequency and 0.9-mm amplitude). Biceps brachii shear elastic modulus measurements were performed prior to exercise (PRE), immediately after exercise (POST-EX) and 5 min after the recovery period (POST-REC).
Biceps brachii shear elastic modulus was significantly increased at POST-EX (+53 ± 48%; p < 0.001) and POST-REC (+31 ± 46%; p = 0.025) when compared to PRE. No differences were found between passive and LV recovery (p = 0.210). LV as a recovery strategy from exercise-induced increased muscle stiffness was not beneficial, probably due to an insufficient mechanical action of the vibrations. Key points: Bouts of barbell curl exercise induce an immediate increase in passive stiffness of the biceps brachii muscle, as evidenced by a greater shear elastic modulus measured by supersonic shear imaging. The administration of a vibratory massage did not reduce this acute exercise-induced increased stiffness. UNLABELLED High-frequency mechanical strain seems to stimulate bone strength in animals. In this randomized controlled trial, hip BMD was measured in postmenopausal women after a 24-week whole body vibration (WBV) training program. Vibration training significantly increased BMD of the hip. These findings suggest that WBV training might be useful in the prevention of osteoporosis. INTRODUCTION High-frequency mechanical strain has been shown to stimulate bone strength in different animal models. However, the effects of vibration exercise on the human skeleton have rarely been studied. Particularly in postmenopausal women, who are most at risk of developing osteoporosis, randomized controlled data on the safety and efficacy of vibration loading are lacking. The aim of this randomized controlled trial was to assess the musculoskeletal effects of high-frequency loading by means of whole body vibration (WBV) in postmenopausal women. MATERIALS AND METHODS Seventy volunteers (age, 58-74 years) were randomly assigned to a whole body vibration training group (WBV, n = 25), a resistance training group (RES, n = 22), or a control group (CON, n = 23). The WBV group and the RES group trained three times weekly for 24 weeks.
The WBV group performed static and dynamic knee-extensor exercises on a vibration platform (35-40 Hz, 2.28-5.09 g), which mechanically loaded the bone and evoked reflexive muscle contractions. The RES group trained the knee extensors by dynamic leg press and leg extension exercises, increasing from low (20 RM) to high (8 RM) resistance. The CON group did not participate in any training. Hip bone density was measured using DXA at baseline and after the 6-month intervention. Isometric and dynamic strength were measured by means of a motor-driven dynamometer. Data were analyzed by means of repeated-measures ANOVA. RESULTS No vibration-related side effects were observed. Vibration training improved isometric and dynamic muscle strength (+15% and +16%, respectively; p < 0.01) and also significantly increased BMD of the hip (+0.93%, p < 0.05). No changes in hip BMD were observed in women participating in resistance training or in age-matched controls (-0.60% and -0.62%, respectively; not significant). Serum markers of bone turnover did not change in any of the groups. CONCLUSION These findings suggest that WBV training may be a feasible and effective way to modify well-recognized risk factors for falls and fractures in older women and support the need for further human studies. Background: Delayed-onset muscle soreness (DOMS) is a common symptom in people participating in exercise, sport, or recreational physical activities. Several remedies have been proposed to prevent and alleviate DOMS. Design and Methods: A five-arm randomized controlled study was conducted to examine the effects of acupuncture on eccentric exercise-induced DOMS of the biceps brachii muscle. Participants were recruited through convenience sampling of students and the general public. Participants were randomly allocated to needle, laser, sham needle, sham laser acupuncture, and no intervention.
Outcome measures included pressure pain threshold (PPT), pain intensity (visual analog scale), and maximum isometric voluntary force. Results: Delayed-onset muscle soreness was induced in 60 participants (22 females; age 23.6 ± 2.8 years, weight 66.1 ± 9.6 kg, and height 171.6 ± 7.9 cm). Neither verum nor sham interventions significantly improved outcomes within 72 hours when compared with the no-treatment control (P > 0.05). Conclusions: Acupuncture was not effective in the treatment of DOMS. From a mechanistic point of view, these results have implications for further studies: (1) considering the high-threshold mechanosensitive nociceptors of the muscle, the cutoff for PPT (5 kg/cm2) chosen to avoid bruising might have led to ceiling effects; (2) the traditional acupuncture regimen, targeting muscle pain, might have been inappropriate as the DOMS mechanisms seem limited to the muscular unit and its innervation. Therefore, a regionally based regimen including intensified intramuscular needling (dry needling) should be tested in future studies, using a higher cutoff for PPT to avoid ceiling effects. CONTEXT Research into alleviating muscle pain and symptoms in individuals after delayed-onset muscle soreness (DOMS) has been inconsistent and unsuccessful in demonstrating a useful recovery modality. OBJECTIVE To investigate the effects of short-term whole-body vibration (WBV) on DOMS over a 72-hour period after a high-intensity exercise protocol. DESIGN Randomized controlled clinical trial. SETTING University laboratory. PATIENTS OR OTHER PARTICIPANTS Thirty women volunteered to participate in 4 testing sessions and were randomly assigned to a WBV group (n = 16; age = 21.0 ± 1.9 years, height = 164.86 ± 6.73 cm, mass = 58.58 ± 9.32 kg) or a control group (n = 14; age = 22.00 ± 1.97 years, height = 166.65 ± 8.04 cm, mass = 58.69 ± 12.92 kg).
INTERVENTION(S) Participants performed 4 sets to failure of single-legged split squats with 40% of their body weight to induce muscle soreness in the quadriceps. The WBV or control treatment was administered each day after DOMS induction. MAIN OUTCOME MEASURE(S) Unilateral pressure-pain threshold (PPT), range of motion (ROM), thigh circumference, and muscle-pain ratings of the quadriceps were collected before and for 3 days after high-intensity exercise. Each day, we collected 3 sets of measures, consisting of 1 measure before the WBV or control treatment protocol (pretreatment) and 2 sets of posttreatment measures. RESULTS We observed no interactions for PPT, thigh circumference, and muscle pain (P > .05). An interaction was found for active ROM (P = .01), with the baseline pretreatment measure greater than the measures at baseline posttreatment 1 through 48 hours posttreatment 2 in the WBV group. For PPT, a main effect for time was revealed (P < .05), with the measure at baseline pretreatment greater than at 24 hours pretreatment and all other time points for the vastus medialis, greater than 24 hours pretreatment through 48 hours posttreatment 2 for the vastus lateralis, and greater than 24 hours pretreatment and 48 hours pretreatment for the rectus femoris. For dynamic muscle pain, we observed a main effect for time (P < .001), with the baseline pretreatment measure less than the measures at all other time points. No main effect for time was noted for thigh circumference (P = .24). No main effect for group was found for any variable (P > .05). CONCLUSIONS The WBV treatment approach studied did not aid in alleviating DOMS after high-intensity exercise. Further research is needed in various populations. PURPOSE To compare whole-body vibration (WBV) with traditional recovery protocols after a high-intensity training bout.
METHODS In a randomized crossover study, 16 athletes performed 6 × 30-s Wingate sprints before completing either an active recovery (10 min of cycling and stretching) or WBV for 10 min in a series of exercises on a vibration platform. Muscle hemodynamics (assessed via near-infrared spectroscopy) were measured before and during exercise and into the 10-min recovery period. Blood lactate concentration, vertical jump, quadriceps strength, flexibility, rating of perceived exertion (RPE), muscle soreness, and performance during a single 30-s Wingate test were assessed at baseline and 30 and 60 min postexercise. A subset of participants (n = 6) completed a third identical trial (1 wk later) using a passive 10-min recovery period (sitting). RESULTS There were no clear effects between the recovery protocols for blood lactate concentration, quadriceps strength, jump height, flexibility, RPE, muscle soreness, or single Wingate performance across all measured recovery time points. However, the WBV recovery protocol substantially increased the tissue-oxygenation index compared with the active (11.2% ± 2.4% [mean ± 95% CI], effect size [ES] = 3.1, and -7.3% ± 4.1%, ES = -2.1 for 10 min postexercise and postrecovery, respectively) and passive recovery conditions (4.1% ± 2.2%, ES = 1.3, 10 min postexercise only). CONCLUSION Although WBV during recovery increased muscle oxygenation, it had little effect in improving subsequent performance compared with a normal active recovery. Objective: The aim of this study was to test the hypothesis that vibration treatment reduces delayed-onset muscle soreness and swelling and enhances recovery of muscle function after eccentric exercise. Design: A randomized crossover design was used. Fifteen young men performed ten sets of six maximal eccentric contractions of the elbow flexors, with the right arm on one occasion and the left arm on the other, separated by 4 wks.
One arm received a 30-min vibration treatment at 30 mins after and 1, 2, 3, and 4 days after the exercise (treatment group), and the other arm did not receive any treatment (control group). The order of the treatment and control conditions and the use of the dominant and nondominant arms were counterbalanced among subjects. Changes in indirect markers of muscle damage were compared between arms by a two-way repeated-measures analysis of variance. Results: Compared with the control group, the treatment group showed significantly (P < 0.05) less development and faster reduction of delayed-onset muscle soreness at 2 to 5 days after exercise. The recovery of range of motion was significantly (P < 0.05) faster for the treatment than for the control group. However, no significant effects on the recovery of muscle strength and serum creatine kinase activity were evident. Immediately after the vibration treatment, a significant (P < 0.05) decrease in the magnitude of delayed-onset muscle soreness and muscle strength and an increase in pressure pain threshold and range of motion were found. Conclusions: These results showed that the vibration treatment was effective for attenuation of delayed-onset muscle soreness and recovery of range of motion after strenuous eccentric exercise but did not affect swelling, recovery of muscle strength, or serum creatine kinase activity. Determining muscle contractile properties following exercise is critical in understanding neuromuscular function. Following high-intensity training, individuals often experience exercise-induced muscle damage (EIMD). The purpose of this investigation was to determine the effect of whole-body vibration (WBV) on muscle contractile properties following EIMD. Twenty-seven females volunteered for 7 sessions and were randomly assigned to a treatment or control group.
Muscle contractile properties were assessed via voluntary torque (VT), peak twitch torque (TT), time to reach peak torque, half relaxation time of twitch torque, percent activation (%ACT), rate of rise (RR), rate of decline (RD), and mean and peak electromyography during maximum voluntary isometric contraction. Two testing sets were collected each day, consisting of pre measures followed by WBV or control and post measures. A mixed-factor analysis of variance was conducted for each variable. %ACT measures found baseline to be less than day 1 in both measures in the control group. TT was found to be greater in the control group compared to the WBV group. TT and VT baseline measures were greater than all other time points. RR showed that the control group had higher values than the WBV group. These results indicate that WBV following EIMD had some positive effects on muscle contractile properties. Rhea, MR, Bunker, D, Marín, PJ, and Lunt, K. Effect of iTonic whole-body vibration on delayed-onset muscle soreness among untrained individuals. J Strength Cond Res 23(6): 1677-1682, 2009. Attempts to reduce or eliminate delayed-onset muscle soreness are important, as this condition is painful and debilitating. The purpose of this study was to examine the effectiveness of whole-body vibration (WBV) massage and stretching exercises at reducing perceived pain among untrained men. Sixteen adult men (age, 36.6 ± 2.1 yr) volunteered to perform a strenuous exercise session consisting of resistance training and repeated sprints. Subjects were randomly assigned to 1 of 2 recovery groups: a group performing WBV stretching sessions or a stretching group performing static stretching without vibration. Both groups performed similar stretches, twice per day for 3 days after the workout. The vibration group performed their stretches on the iTonic platform (frequency, 35 Hz; amplitude, 2 mm).
Perceived pain was measured at 12, 24, 48, and 72 hours postworkout. Statistical analyses identified a significantly lower level of reported perceived pain at all postworkout measurement times in the WBV group (p < 0.05). No difference existed at the preworkout measurement time. The degree of attenuation of pain ranged from 22-61%. These data suggest that incorporating WBV as a recovery/regeneration tool may be effective for reducing the pain of muscle soreness and tightness after strenuous training.
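Several of the abstracts above specify the vibration dose as a frequency plus a displacement amplitude (e.g., 12 Hz at 4 mm; 35 Hz at 2 mm), while others report peak acceleration in g (e.g., 2.28-5.09 g). A rough way to relate the two, assuming sinusoidal platform displacement, is the peak acceleration a = (2πf)²A. This is a sketch of the standard physics conversion, not any trial's own calculation, and reported g values also depend on whether the amplitude is quoted as peak or peak-to-peak.

```python
import math

def peak_acceleration(freq_hz: float, amplitude_m: float) -> float:
    """Peak acceleration (m/s^2) of a sinusoidal vibration: a = (2*pi*f)^2 * A."""
    return (2 * math.pi * freq_hz) ** 2 * amplitude_m

def in_g(accel_ms2: float) -> float:
    """Express an acceleration as a multiple of standard gravity."""
    return accel_ms2 / 9.80665

# e.g., 12 Hz at 4 mm peak amplitude, as reported in the first abstract:
a = peak_acceleration(12, 0.004)
```

For 12 Hz and 4 mm this gives roughly 23 m/s², i.e. a little over 2 g, which is in the range the platform studies above report.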
11,060
25,425,128
We found minimal evidence to support the hypothesis that prescribed physical activity/exercise training results in decreased non-exercise physical activity/energy expenditure in healthy adults.
Prescribed physical activity/exercise training may reduce non-exercise physical activity, resulting in no change in total daily energy expenditure and no or minimal exercise-induced weight loss.
Overweight and obesity affect more than 66% of the adult population and are associated with a variety of chronic diseases. Weight reduction reduces health risks associated with chronic diseases and is therefore encouraged by major health agencies. Guidelines of the National Heart, Lung, and Blood Institute (NHLBI) encourage a 10% reduction in weight, although considerable literature indicates reduction in health risk with a 3% to 5% reduction in weight. Physical activity (PA) is recommended as a component of weight management for prevention of weight gain, for weight loss, and for prevention of weight regain after weight loss. In 2001, the American College of Sports Medicine (ACSM) published a Position Stand that recommended a minimum of 150 min wk(-1) of moderate-intensity PA for overweight and obese adults to improve health; however, 200-300 min wk(-1) was recommended for long-term weight loss. More recent evidence has supported this recommendation and has indicated more PA may be necessary to prevent weight regain after weight loss. To this end, we have reexamined the evidence from 1999 to determine whether there is a level at which PA is effective for prevention of weight gain, for weight loss, and for prevention of weight regain. Evidence supports moderate-intensity PA between 150 and 250 min wk(-1) to be effective to prevent weight gain. Moderate-intensity PA between 150 and 250 min wk(-1) will provide only modest weight loss. Greater amounts of PA (> 250 min wk(-1)) have been associated with clinically significant weight loss. Moderate-intensity PA between 150 and 250 min wk(-1) will improve weight loss in studies that use moderate diet restriction but not severe diet restriction. Cross-sectional and prospective studies indicate that after weight loss, weight maintenance is improved with PA > 250 min wk(-1).
However, no evidence from well-designed randomized controlled trials exists to judge the effectiveness of PA for prevention of weight regain after weight loss. Resistance training does not enhance weight loss but may increase fat-free mass, increase loss of fat mass, and is associated with reductions in health risk. Existing evidence indicates that endurance PA or resistance training without weight loss improves health risk. There is inadequate evidence to determine whether PA prevents or attenuates detrimental changes in chronic disease risk during weight gain. Background: It has been suggested that exercise training results in compensatory mechanisms that attenuate weight loss. However, this has only been examined with large doses of exercise. The goal of this analysis was to examine actual weight loss compared to predicted weight loss (compensation) across different doses of exercise in a controlled trial of sedentary, overweight or obese postmenopausal women (n = 411). Methodology/Principal Findings: Participants were randomized to a non-exercise control (n = 94) or 1 of 3 exercise groups with an exercise energy expenditure of 4 (n = 139), 8 (n = 85), or 12 (n = 93) kcal/kg/week (KKW). Training intensity was set at the heart rate associated with 50% of each woman's peak VO2, and the intervention period was 6 months. All exercise was supervised. The main outcomes were actual weight loss, predicted weight loss (exercise energy expenditure / 7700 kcal per kg), compensation (actual minus predicted weight loss), and waist circumference. The study sample had a mean (SD) age of 57.2 (6.3) years and a BMI of 31.7 (3.8) kg/m2, and was 63.5% Caucasian. Adherence to the intervention was > 99% in all exercise groups. The mean (95% CI) weight loss in the 4, 8 and 12 KKW groups was −1.4 (−2.0, −0.8), −2.1 (−2.9, −1.4) and −1.5 (−2.2, −0.8) kg, respectively.
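The trial above defines its two key outcomes arithmetically: predicted weight loss as exercise energy expenditure divided by 7700 kcal per kg, and compensation as actual minus predicted weight loss. A minimal sketch of that calculation follows; the function names are illustrative, not taken from the paper.

```python
KCAL_PER_KG = 7700.0  # energy equivalent of 1 kg body mass, as used in the trial

def predicted_weight_loss(exercise_kcal: float) -> float:
    """Predicted weight change in kg (negative = loss) from total exercise energy expenditure."""
    return -exercise_kcal / KCAL_PER_KG

def compensation(actual_kg: float, predicted_kg: float) -> float:
    """Compensation = actual minus predicted weight loss; positive means less loss than predicted."""
    return actual_kg - predicted_kg
```

With the 12 KKW group's figures (actual −1.5 kg vs. predicted −2.7 kg), `compensation(-1.5, -2.7)` reproduces the reported ~1.2 kg of compensation.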
In the 4 and 8 KKW groups the actual weight loss closely matched the predicted weight loss of −1.0 and −2.0 kg, respectively, resulting in no significant compensation. In the 12 KKW group the actual weight loss was less than the predicted weight loss (−2.7 kg), resulting in 1.2 (0.5, 1.9) kg of compensation (P < 0.05 compared to the 4 and 8 KKW groups). All exercise groups had a significant reduction in waist circumference, which was independent of changes in weight. Conclusion: In this study of previously sedentary, overweight or obese, postmenopausal women we observed no difference in the actual and predicted weight loss with 4 and 8 KKW of exercise (72 and 136 minutes, respectively), while 12 KKW (194 minutes) produced only about half of the predicted weight loss. However, all exercise groups had a significant reduction in waist circumference, which was independent of changes in weight. Trial Registration: ClinicalTrials.gov NCT Keeping moderately active is the best way to boost total daily energy expenditure. BACKGROUND It is not clear how decreased activity quantitatively affects energy balance (EB) in subjects feeding ad libitum. OBJECTIVE We assessed the effect of an imposed sedentary routine on appetite, energy intake (EI), EB, and nutrient balance in lean men for 7 d. DESIGN Six men with a mean (±SD) age of 23.0 ± 2.3 y, weight of 69.2 ± 11.4 kg, and height of 1.76 ± 0.07 m were each studied twice, during a sedentary [1.4 × resting metabolic rate (RMR)] and a moderately active (1.8 × RMR) regimen. During each treatment, they resided in the whole-body indirect calorimeter for the 7 d and had ad libitum access to a medium-fat diet of constant, measurable composition. Meal size, frequency, and composition were continually monitored. Motivation to eat was recorded during waking hours. Subjects were weighed in light clothing each morning, and their weight was corrected to nude.
RESULTS Energy expenditure was 9.7 and 12.8 MJ/d [P < 0.01; SE of the difference between means (SED) = 0.41] during the sedentary and active regimens, respectively. EI was 13.5 and 14.4 MJ/d (P = 0.463, SED = 1.06), respectively. There was no regimen effect on hunger, appetite, or body weight. By day 7, cumulative EB was 26.3 and 11.1 MJ, respectively. CONCLUSIONS Reducing the level of physical activity from 1.8 to 1.4 × RMR can markedly affect EB. A sedentary routine does not induce a compensatory reduction of EI and leads to a significantly positive EB, most of which is stored as fat. Overwhelming evidence shows the quality of reporting of randomised controlled trials (RCTs) is not optimal. Without transparent reporting, readers cannot judge the reliability and validity of trial findings nor extract information for systematic reviews. Recent methodological analyses indicate that inadequate reporting and design are associated with biased estimates of treatment effects. Such systematic error is seriously damaging to RCTs, which are considered the gold standard for evaluating interventions because of their ability to minimise or avoid bias. A group of scientists and editors developed the CONSORT (Consolidated Standards of Reporting Trials) statement to improve the quality of reporting of RCTs. It was first published in 1996 and updated in 2001. The statement consists of a checklist and flow diagram that authors can use for reporting an RCT. Many leading medical journals and major international editorial groups have endorsed the CONSORT statement. The statement facilitates critical appraisal and interpretation of RCTs. During the 2001 CONSORT revision, it became clear that explanation and elaboration of the principles underlying the CONSORT statement would help investigators and others to write or appraise trial reports.
A CONSORT explanation and elaboration article was published in 2001 alongside the 2001 version of the CONSORT statement. After an expert meeting in January 2007, the CONSORT statement has been further revised and is published as the CONSORT 2010 Statement. This update improves the wording and clarity of the previous checklist and incorporates recommendations related to topics that have only recently received recognition, such as selective outcome reporting bias. This explanatory and elaboration document, intended to enhance the use, understanding, and dissemination of the CONSORT statement, has also been extensively revised. It presents the meaning and rationale for each new and updated checklist item, providing examples of good reporting and, where possible, references to relevant empirical studies. Several examples of flow diagrams are included. The CONSORT 2010 Statement, this revised explanatory and elaboration document, and the associated website (www.consort-statement.org) should be helpful resources to improve reporting of randomised trials. BACKGROUND Recent randomized controlled trials indicated that exercise training for the elderly significantly increased their physical fitness. However, very few studies have examined changes in physical activity after exercise training. The purpose of this study was to investigate whether six-month exercise training for older adults can increase and maintain their physical activity in daily life. METHODS Sixty-two men and women aged 60 to 81 years (mean age 67.1 years), living in communities, were randomly allocated into an exercise group (n = 32) or a control group (n = 33). The intervention started in April 1998 and lasted for 25 weeks. The exercise regimen consisted of endurance training and resistance exercises in a two-hour class conducted at least twice a week.
The subjects completed a physical activity diary at each of the pre-intervention (March 1998), post-intervention (September 1998) and follow-up (April 1999) measurements of physical activity. Physical activity, expressed as total daily energy expenditure, was calculated by multiplying the amount of time spent in each activity by the corresponding METs. RESULTS Total daily energy expenditure significantly increased from 40.8 kcal/kg/day to 43.5 kcal/kg/day in the exercise group (p = 0.03) but did not change in the control group. At the follow-up measurement, the mean total daily energy expenditure in the exercise group remained significantly higher, by 1.7 kcal/kg/day, than that at pre-intervention (p = 0.05). CONCLUSIONS This randomized controlled trial indicated that exercise training for the elderly was effective in increasing physical activity in daily life. Exercise is recommended by public health agencies for weight management; however, the role of exercise is generally considered secondary to energy restriction. Few studies exist that have verified completion of exercise, measured the energy expenditure of exercise, and prescribed exercise with equivalent energy expenditure across individuals and genders. Objective: The objective of this study was to evaluate aerobic exercise, without energy restriction, on weight loss in sedentary overweight and obese men and women. Design and Methods: This investigation was a randomized, controlled efficacy trial in 141 overweight and obese participants (body mass index, 31.0 ± 4.6 kg/m2; age 22.6 ± 3.9 years). Participants were randomized (2:2:1 ratio) to exercise at either 400 kcal/session or 600 kcal/session or to a non-exercise control. Exercise was supervised, 5 days/week, for 10 months. All participants were instructed to maintain their usual ad libitum diets.
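The diary calculation in the 25-week trial above (total daily energy expenditure as time spent in each activity multiplied by its MET value) can be sketched as follows. Since 1 MET-hour is roughly 1 kcal per kg of body weight, summing MET-hours over a full day yields kcal/kg/day, the unit the study reports. The activity list in the example is hypothetical, not taken from the study's diaries.

```python
def total_daily_energy_expenditure(activities):
    """activities: iterable of (hours, mets) pairs covering the 24-h day.
    Returns kcal per kg body weight per day (1 MET-hour ~ 1 kcal/kg)."""
    return sum(hours * mets for hours, mets in activities)

# Hypothetical day: 8 h sleep (0.9 MET), 12 h light activity (1.5 MET), 4 h walking (3.0 MET)
tdee = total_daily_energy_expenditure([(8, 0.9), (12, 1.5), (4, 3.0)])
```

A day averaging around 1.7 METs in this way lands near the 40 kcal/kg/day range the study reports.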
Due to the efficacy design, completion of ≥ 90% of exercise sessions was an a priori definition of per protocol, and these participants were included in the analysis. Results: Weight loss from baseline to 10 months for the 400 and 600 kcal/session groups was 3.9 ± 4.9 kg (4.3%) and 5.2 ± 5.6 kg (5.7%), respectively, compared to a weight gain of 0.5 ± 3.5 kg (0.5%) for controls (p < 0.05). Differences in weight loss from baseline to 10 months between the exercise groups, and differences between men and women within groups, were not statistically significant. Conclusions: Supervised exercise, with equivalent energy expenditure, results in clinically significant weight loss, with no significant difference between men and women. BACKGROUND It is currently unclear how physical activity and diet interact within the ranges of activity seen in the general population. This study aimed to establish whether a small, acute increase in physical activity would lead to a compensatory change in energy intake and nutrient balance, and to provide power analysis data for future research in this field. METHOD Twelve participants were studied over 7 days of habitual activity and for 2 weeks after instruction to increase physical activity by 2000 steps per day. Physical activity was assessed using a diary, the 'activPAL' activity monitor and a pedometer. Dietary analyses from prospective food diaries were compared between the first and third weeks. RESULTS Participants increased step counts (+2600 steps per day, P = 0.008) and estimated energy expenditure (+300-1000 kJ day(-1), P = 0.002) but did not significantly change their energy intake, dietary composition or number of meals per day. From reverse power analysis, 38 participants would be needed to exclude a change in energy intake of 400 kJ day(-1) with 90% power at P < 0.05; 400 kJ day(-1) would compensate for a 2000 steps per day increase in physical activity.
CONCLUSION These results did not demonstrate any compensatory increase in food consumption when physical activity was increased by walking an average of 2600 additional steps per day. Power analysis indicates that a larger study (n = 38) will be necessary to exclude such an effect with confidence. PURPOSE An active lifestyle is widely recognized as having a beneficial effect on cardiovascular health. However, no clear consensus exists as to whether exercise training increases overall physical activity energy expenditure (PAEE) or whether individuals participating in regular exercise compensate by reducing their off-exercise physical activity. The purpose of this study was to evaluate changes in PAEE in response to aerobic training (AT), resistance training (RT), or combined aerobic and resistance training (AT/RT). METHODS Data are from 82 participants in the Studies of Targeted Risk Reduction Interventions through Defined Exercise-Aerobic Training versus Resistance Training study, a randomized trial of overweight (body mass index = 25-35 kg·m(-2)) adults, in which participants were randomized to receive 8 months of AT, RT, or AT/RT. All subjects completed a 4-month control period before randomization. PAEE was measured using triaxial RT3 accelerometers, which subjects wore for a 5- to 7-d period before and after the exercise intervention. Data reduction was performed with a previously published computer-based algorithm. RESULTS There was no significant change in off-exercise PAEE in any of the exercise training groups. We observed a significant increase in total PAEE, including the exercise training, in both AT and AT/RT but not in RT. CONCLUSIONS Eight months of exercise training was not associated with a compensatory reduction in off-exercise physical activity, regardless of exercise modality.
The absence of compensation is particularly notable for the AT/RT subjects, who performed a larger volume of exercise than did the AT or RT subjects. We believe that the extended duration of our exercise training program was the key factor in allowing subjects to reach a new steady-state level of physical activity within their daily lives. We assessed the effect of no exercise (Nex; control) and a high exercise level (Hex; approximately 4 MJ/day) and two dietary manipulations [a high-fat diet (HF; 50% of energy, 700 kJ/100 g) and a low-fat diet (LF; 20% of energy, 300 kJ/100 g)] on compensatory changes in energy intake (EI) and energy expenditure (EE) over 7-day periods. Eight lean men were each studied four times in a 2 x 2 randomized design. EI was directly quantified by the weight of food consumed. EE was assessed by heart rate (HR) monitoring. Body weight was measured daily. Mean daily EE was 17.6 and 11.5 MJ/day (P < 0.001) on the pooled Hex and Nex treatments, respectively. EI was higher on the HF diets (13.4 MJ/day pooled) compared with the LF diets (9.0 MJ/day). Regression analysis showed that these energy imbalances induced significant compensatory changes in EB over time of approximately 0.3-0.4 MJ/day (P < 0.05). These were due to changes in both EI and EE in the opposite direction to the perturbation in energy balance. These changes were significant, small but persistent, amounting to approximately 0.2 and 0.35 MJ/day for EI and EE, respectively. INTRODUCTION We examined the effects of three exercise training interventions on total physical activity energy expenditure (PAEE) and nonexercise PAEE in a randomized controlled trial in which sedentary, overweight, and obese men and women were assigned to an inactive control, low-amount/moderate-intensity, low-amount/vigorous-intensity, or high-amount/vigorous-intensity aerobic exercise.
METHODS To measure PAEE, triaxial RT3 accelerometers were worn by subjects for 7 d at the beginning and end of an 8-month exercise intervention. In total, 50 subjects (control, n = 8; two low-amount groups, n = 28; high-amount group, n = 14) had usable PAEE data collected at both time points. RESULTS At baseline, subjects had an average age of 53.2 yr, had a body mass index of 29.7 kg x m(-2), and a relative peak VO2 of 28.7 mL x kg(-1) x min(-1). There were no significant differences between groups at baseline. After the intervention, average change in total PAEE was 8.4 +/- 20.9 kJ x h(-1) for controls, 58.6 +/- 20.9 kJ x h(-1) for the two low-amount groups, and 138.1 +/- 33.5 kJ x h(-1) for the high-amount group (means +/- SE). The high-amount group experienced a significantly greater increase in total PAEE compared with the controls (P = 0.02). As expected, total PAEE increased with increasing exercise volume. Average change in nonexercise PAEE was 8.4 +/- 20.9 kJ x h(-1) for control, 25.1 +/- 20.9 kJ x h(-1) for the low-amount groups combined, and 62.8 +/- 29.3 kJ x h(-1) for the high-amount group. There was no statistically significant difference in change of nonexercise PAEE among groups. CONCLUSIONS We conclude that in middle-aged overweight or obese subjects participating in an extended exercise intervention, total PAEE increased, and there was no compensatory decrease in nonexercise PAEE The amount of weight loss induced by exercise is often disappointing. A diet-induced negative energy balance triggers compensatory mechanisms, e.g., lower metabolic rate and increased appetite. However, knowledge about potential compensatory mechanisms triggered by increased aerobic exercise is limited.
A randomized controlled trial was performed in healthy, sedentary, moderately overweight young men to examine the effects of increasing doses of aerobic exercise on body composition, accumulated energy balance, and the degree of compensation. Eighteen participants were randomized to a continuous sedentary control group, 21 to a moderate-exercise (MOD; 300 kcal/day), and 22 to a high-exercise (HIGH; 600 kcal/day) group for 13 wk, corresponding to ∼30 and 60 min of daily aerobic exercise, respectively. Body weight (MOD: -3.6 kg, P < 0.001; HIGH: -2.7 kg, P = 0.01) and fat mass (MOD: -4.0 kg, P < 0.001 and HIGH: -3.8 kg, P < 0.001) decreased similarly in both exercise groups. Although the exercise-induced energy expenditure in HIGH was twice that of MOD, the resulting accumulated energy balance, calculated from changes in body composition, was not different (MOD: -39.6 Mcal, HIGH: -34.3 Mcal, not significant). Energy balance was 83% more negative than expected in MOD, while it was 20% less negative than expected in HIGH. No statistically significant changes were found in energy intake or nonexercise physical activity that could explain the different compensatory responses associated with 30 vs. 60 min of daily aerobic exercise. In conclusion, a similar body fat loss was obtained regardless of exercise dose. A moderate dose of exercise induced a markedly greater than expected negative energy balance, while a higher dose induced a small but quantifiable degree of compensation BACKGROUND In light of the current obesity epidemic, treatment models are needed that can prevent weight gain or provide weight loss. We examined the long-term effects of a supervised program of moderate-intensity exercise on body weight and composition in previously sedentary, overweight and moderately obese men and women.
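The "more/less negative than expected" figures above come from comparing the energy deficit predicted from the prescribed exercise with the deficit actually measured from body-composition change. A minimal sketch of that arithmetic (illustrative inputs only; the trial computed its expected deficit from measured exercise energy expenditure, so these numbers will not reproduce its reported percentages):

```python
def degree_of_compensation(expected_deficit, measured_deficit):
    """Fraction of the expected energy deficit offset by compensation
    (positive) or amplified beyond expectation (negative).
    Both deficits are passed as positive magnitudes in kcal."""
    return (expected_deficit - measured_deficit) / expected_deficit

# Hypothetical example: 600 kcal/day prescribed for 13 weeks
expected = 600 * 7 * 13      # 54,600 kcal expected deficit
measured = 34_300            # deficit inferred from body-composition change
print(round(degree_of_compensation(expected, measured), 2))  # → 0.37
```

A negative result would correspond to the MOD-style outcome, where the measured deficit exceeded the expected one.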
We hypothesized that a 16-month program of verified exercise would prevent weight gain or provide weight loss in the exercise group compared with controls. METHODS This was a randomized controlled efficacy trial. Participants were recruited from 2 midwestern universities and their surrounding communities. One hundred thirty-one participants were randomized to exercise or control groups, and 74 completed the intervention and all laboratory testing. Exercise was supervised, and the level of energy expenditure of exercise was measured. Controls remained sedentary. All participants maintained ad libitum diets. RESULTS Exercise prevented weight gain in women and produced weight loss in men. Men in the exercise group had significant mean +/- SD decreases in weight (5.2 +/- 4.7 kg), body mass index (calculated as weight in kilograms divided by the square of height in meters) (1.6 +/- 1.4), and fat mass (4.9 +/- 4.4 kg) compared with controls. Women in the exercise group maintained baseline weight, body mass index, and fat mass, and controls showed significant mean +/- SD increases in body mass index (1.1 +/- 2.0), weight (2.9 +/- 5.5 kg), and fat mass (2.1 +/- 4.8 kg) at 16 months. No significant changes occurred in fat-free mass in either men or women; however, both had significantly reduced visceral fat. CONCLUSIONS Moderate-intensity exercise sustained for 16 months is effective for weight management in young adults Effects of progressive endurance training on energy expenditure (EE) were studied in thirteen elderly sedentary subjects (62.8 (SD 2.3) years) after 7 and 14 weeks of training. Daily EE (DEE) and energy cost of the various usual activities were measured over 48 h by whole-body indirect calorimetry. Free-living DEE (DEEFLC) was calculated from 7 d activity recordings and the energy costs of activities were measured in the calorimeters using the factorial method.
DEEFLC did not vary significantly throughout the training period despite the additional energy cost of training sessions (0.60 (SD 0.15) MJ/d), because energy expended during free-living activities (EEACT) decreased by 4.8 (SD 7.1)% (P < 0.05) and 7.7 (SD 8.6)% (P < 0.01) after 7 and 14 weeks of training respectively. Measurements in the calorimeters showed that sleeping metabolic rate transiently increased by 4.6 (SD 3.2)% after 7 weeks of training (P < 0.001) and returned to its initial level after 14 weeks of training. BMR was 7.6 (SD 7.0)% (P < 0.01) and 4.1 (SD 6.1)% (P = NS) higher after 7 and 14 weeks of training respectively, than before training. Likewise, diet-induced thermogenesis increased from 3.7 (SD 2.5) to 7.2 (SD 2.8)% energy intake after 7 weeks of training (P < 0.05), and returned to its initial level after 14 weeks of training (4.2 (SD 2.6)% energy intake). Despite these changes, energy expended during activities and the corresponding DEE did not vary throughout the training period. It was concluded that: (1) DEEFLC remained constant throughout the training period due to a compensatory decrease in free-living EEACT; (2) progressive endurance training induced a transient increase in sleeping metabolic rate, BMR and diet-induced thermogenesis after 7 weeks which was not reflected in the energy expended during activities and DEE Overwhelming evidence shows the quality of reporting of randomised controlled trials (RCTs) is not optimal. Without transparent reporting, readers cannot judge the reliability and validity of trial findings nor extract information for systematic reviews. Recent methodological analyses indicate that inadequate reporting and design are associated with biased estimates of treatment effects. Such systematic error is seriously damaging to RCTs, which are considered the gold standard for evaluating interventions because of their ability to minimise or avoid bias.
A group of scientists and editors developed the CONSORT (Consolidated Standards of Reporting Trials) statement to improve the quality of reporting of RCTs. It was first published in 1996 and updated in 2001. The statement consists of a checklist and flow diagram that authors can use for reporting an RCT. Many leading medical journals and major international editorial groups have endorsed the CONSORT statement. The statement facilitates critical appraisal and interpretation of RCTs. During the 2001 CONSORT revision, it became clear that explanation and elaboration of the principles underlying the CONSORT statement would help investigators and others to write or appraise trial reports. A CONSORT explanation and elaboration article was published in 2001 alongside the 2001 version of the CONSORT statement. After an expert meeting in January 2007, the CONSORT statement has been further revised and is published as the CONSORT 2010 Statement. This update improves the wording and clarity of the previous checklist and incorporates recommendations related to topics that have only recently received recognition, such as selective outcome reporting bias. This explanatory and elaboration document, intended to enhance the use, understanding, and dissemination of the CONSORT statement, has also been extensively revised. It presents the meaning and rationale for each new and updated checklist item, providing examples of good reporting and, where possible, references to relevant empirical studies. Several examples of flow diagrams are included. The CONSORT 2010 Statement, this revised explanatory and elaboration document, and the associated website (www.consort-statement.org) should be helpful resources to improve reporting of randomised trials BACKGROUND Exercise interventions elicit only modest weight loss, which might reflect a compensatory reduction in nonprescribed physical activity energy expenditure (PAEE).
OBJECTIVE The objective was to investigate whether there is a reduction in nonprescribed PAEE as a result of participation in a 6-mo structured exercise intervention in middle-aged men. DESIGN Sedentary male participants [age: 54 ± 5 y; body mass index (in kg/m²): 28 ± 3] were randomly assigned to a 6-mo progressive exercise (EX) or control (CON) group. Energy expenditure during structured exercise (prescribed PAEE) and nonprescribed PAEE were determined with the use of synchronized accelerometry and heart rate before the intervention, during the intervention (2, 9, and 18 wk), and within a 2-wk period of detraining after the intervention. RESULTS Structured prescribed exercise increased total PAEE and had no detrimental effect on nonprescribed PAEE. Indeed, there was a trend for greater nonprescribed PAEE in the EX group (P = 0.09). Weight loss in the EX group (-1.8 ± 2.2 kg compared with +0.2 ± 2.2 kg in the CON group, P < 0.02) reflected only ≈40% of the 300-373 kcal/kg body mass potential energy deficit from prescribed exercise. Serum leptin concentration decreased by 24% in the EX group (compared with 3% in the CON group, P < 0.03), and we estimate that this was accompanied by a compensatory increase in energy intake of ≈100 kcal/d. CONCLUSIONS The adoption of regular structured exercise in previously sedentary, middle-aged, and overweight men does not result in a negative compensatory reduction in nonprescribed physical activity. The less-than-predicted weight loss is likely to reflect a compensatory increase in energy intake in response to a perceived state of relative energy insufficiency
11,061
25,790,326
The trial showed that surgical intervention resulted in a higher percentage of participants with pain relief and better preservation of pancreatic function. For patients with obstructive chronic pancreatitis and dilated pancreatic duct, this review shows that surgery is superior to endoscopy in terms of pain relief. Morbidity and mortality seem not to differ between the two intervention modalities, but the small trials identified do not provide sufficient power to detect the small differences expected in this outcome. Regarding the comparison of surgical intervention versus conservative treatment, this review has shown that surgical intervention in an early stage of chronic pancreatitis is a promising approach in terms of pain relief and pancreatic function.
BACKGROUND Endoscopy and surgery are the treatment modalities of choice for patients with chronic pancreatitis and dilated pancreatic duct (obstructive chronic pancreatitis). Physicians face, without clear consensus, the choice between endoscopy or surgery for this group of patients. OBJECTIVES To assess and compare the effects and complications of surgical and endoscopic interventions in the management of pain for obstructive chronic pancreatitis.
BACKGROUND For patients with chronic pancreatitis and a dilated pancreatic duct, ductal decompression is recommended. We conducted a randomized trial to compare endoscopic and surgical drainage of the pancreatic duct. METHODS All symptomatic patients with chronic pancreatitis and a distal obstruction of the pancreatic duct but without an inflammatory mass were eligible for the study. We randomly assigned patients to undergo endoscopic transampullary drainage of the pancreatic duct or operative pancreaticojejunostomy. The primary end point was the average Izbicki pain score during 2 years of follow-up. The secondary end points were pain relief at the end of follow-up, physical and mental health, morbidity, mortality, length of hospital stay, number of procedures undergone, and changes in pancreatic function. RESULTS Thirty-nine patients underwent randomization: 19 to endoscopic treatment (16 of whom underwent lithotripsy) and 20 to operative pancreaticojejunostomy. During the 24 months of follow-up, patients who underwent surgery, as compared with those who were treated endoscopically, had lower Izbicki pain scores (25 vs. 51, P<0.001) and better physical health summary scores on the Medical Outcomes Study 36-Item Short-Form General Health Survey questionnaire (P=0.003). At the end of follow-up, complete or partial pain relief was achieved in 32% of patients assigned to endoscopic drainage as compared with 75% of patients assigned to surgical drainage (P=0.007). Rates of complications, length of hospital stay, and changes in pancreatic function were similar in the two treatment groups, but patients receiving endoscopic treatment required more procedures than did patients in the surgery group (a median of eight vs. three, P<0.001). CONCLUSIONS Surgical drainage of the pancreatic duct was more effective than endoscopic treatment in patients with obstruction of the pancreatic duct due to chronic pancreatitis.
(Current Controlled Trials number, ISRCTN04572410 [controlled-trials.com].) OBJECTIVE This study evaluated the effect of operative drainage of the main pancreatic duct (MPD) on functional derangements associated with chronic pancreatitis (CP). SUMMARY BACKGROUND DATA The author previously reported delayed functional impairment in an evaluation of the impact of operative drainage in patients with CP. The author now reports on a prospective study of 143 patients with this diagnosis. METHODS Each patient underwent 1) ERCP, 2) the Bentiromide PABA, 3) 72-hour fecal fat test, 4) oral glucose tolerance test (OGTT) and 5) fat meal (LIPOMUL)-stimulated pancreatic polypeptide release (PP). All patients were stratified as mild/moderate (M/M) or severe CP on the basis of a 5-point system that was developed by the author. Patients were studied at 16-month intervals. RESULTS All 143 patients underwent initial and follow-up evaluations in a mean follow-up of 47.3 months; 83 of 143 patients had M/M grade at initial evaluation. Eighty-seven patients underwent MPD decompression to relieve abdominal pain. In a separate prospective study, 17 patients with a diagnosis of CP, a grade of M/M, and non-disabling abdominal pain were randomized to operative or non-operative treatment; 9 of these randomized patients were operated upon and 8 were not. No patient improved their grade during follow-up; 47 of 83 M/M patients had operative drainage and 36 did not. This grade was preserved in 41 of 47 (87%) operated patients but in only 8 of the 36 non-operated patients (22%). In the randomized trial, seven of nine operated patients retained their functional status in follow-up, whereas only two of eight patients (25%) randomized to non-operation preserved their functional grade.
CONCLUSIONS These data from this large study, as well as from a previous randomized sample, support a policy of early operative drainage before the development of irreversible functional impairment in patients with chronic pancreatitis and associated dilation of the main pancreatic duct OBJECTIVE To analyze the efficacy of extended drainage, that is, longitudinal pancreaticojejunostomy combined with local pancreatic head excision (LPJ-LPHE), and pylorus-preserving pancreatoduodenectomy (PPPD) in terms of pain relief, control of complications arising from adjacent organs, and quality of life. SUMMARY BACKGROUND DATA Based on the hypotheses of pain origin (ductal hypertension and perineural inflammatory infiltration), drainage and resection constitute the main principles of surgery for chronic pancreatitis. METHODS Sixty-one patients were randomly allocated to either LPJ-LPHE (n = 31) or PPPD (n = 30). The interval between symptoms and surgery ranged from 12 months to 10 years (mean 5.1 years). In addition to routine pancreatic diagnostic workup, a multidimensional psychometric quality-of-life questionnaire and a pain score were used. Endocrine and exocrine functions were assessed in terms of oral glucose tolerance and serum concentrations of insulin, C-peptide, and HbA1c, as well as fecal chymotrypsin and pancreolauryl testing. During a median follow-up of 24 months (range 12 to 36), patients were reassessed in the outpatient clinic. RESULTS One patient died of cardiovascular failure in the LPJ-LPHE group (3.2%); there were no deaths in the PPPD group. Overall, the rate of in-hospital complications was 19.4% in the LPJ-LPHE group and 53.3% in the PPPD group, including delayed gastric emptying in 9 of 30 patients (30%; p < 0.05). Complications of adjacent organs were definitively resolved in 93.5% in the LPJ-LPHE group and in 100% in the PPPD group.
The pain score decreased by 94% after LPJ-LPHE and by 95% after PPPD. Global quality of life improved by 71% in the LPJ-LPHE group and by 43% in the PPPD group (p < 0.01). CONCLUSIONS Both procedures are equally effective in terms of pain relief and definitive control of complications affecting adjacent organs, but extended drainage by LPJ-LPHE provides a better quality of life Background This study aimed to compare the outcomes of endoscopic treatment (ET) and surgical treatment (ST) for common bile duct (CBD) stricture in patients with chronic pancreatitis (CP). Methods From 2004 to 2009, 39 patients (35 men and 4 women; median age, 52 years; range, 38–66 years) were referred for CBD stricture in CP. Of these 39 patients, 33 (85%) underwent primary ET, and 6 underwent primary ST. Treatment success was defined in both groups as the absence of signs denoting recurrence, with normal serum bilirubin and alkaline phosphatase levels after permanent stent removal in the ET group. The follow-up period was longer than 12 months for all the patients. Results For the patients treated with ET, the mean number of biliary procedures was 3 (range, 1–10) per patient, including extractible metallic stents in 35% and multiple plastic stents in 65% of the patients. The mean duration of stent intubation was 11 months. The surgical procedure associated with biliary drainage (4 choledochoduodenostomies, 1 choledochojejunostomy, and 1 biliary decompression within the pancreatic head) was a Frey procedure for five patients and a pancreaticojejunostomy for one patient. The overall morbidity rate was higher in the ST group. The total hospital length of stay was similar in the two groups (16 vs 24 days, respectively; p = 0.21). In terms of intention to treat, the success rates for ST and ET did not differ significantly (83% vs 76%; p = 0.08). Due to failure, 17 patients required ST after ET.
Event-free survival was significantly longer in the ST group (16.9 vs 5.8 months; p = 0.01). The actuarial success rates were 74% at 6 months, 74% at 12 months, and 65% at 24 months in the ST group and respectively 75%, 69%, and 12% in the ET group (p = 0.01). After more than three endoscopic procedures, the success rates were 27% at 6 months and 18% at 18 months. Conclusion For bile duct stricture in CP, surgery is associated with better long-term outcomes than endoscopic therapy. After more than three endoscopic procedures, the success rate is low This paper discusses the various philosophies that influence the selection of patients for entry into randomized controlled trials. Although a number of different and often competing issues have to be considered depending upon the trial, keeping entry criteria simple, wide and at times even flexible is usually preferable. Such a strategy can be a positive virtue by helping to attain the large numbers of patients that are usually needed to reliably detect the sorts of moderate benefits that are plausible, at a reasonable cost, and by providing answers that are relevant to many different categories of patients with a particular condition Background A number of studies support the use of endoscopically placed pancreatic duct (PD) stents to decrease pain in chronic pancreatitis (CP). Nevertheless, flaws in study design have prevented experts from reaching a consensus. Purpose (1) Evaluate the efficacy of PD stenting to ameliorate abdominal pain in patients with CP and ductal strictures; (2) evaluate the placebo response rate from sham endoscopic therapy; (3) compare pain medication usage, healthcare utilization, psychological distress, and quality of life before and after endoscopic stenting; (4) prospectively evaluate the durability of the response.
Methods Patients with typical abdominal pain, imaging confirmation of CP, and endoscopic retrograde cholangiopancreatography (ERCP) confirmation of PD stricture will complete questionnaires to assess quality of life, psychological distress, pain intensity/unpleasantness, pain medication usage, and healthcare utilization. Enrolled patients will be randomized to ERCP with sphincterotomy and PD stenting versus sham procedure. Pain level and medication usage will be assessed weekly with telephone interviews. At 6-8 weeks, patients treated with stents will undergo stent removal; those randomized to the sham procedure without significant improvement (< 50% reduction in pain score) will cross over to the treatment group; and those randomized to the sham procedure who experienced improvement (> 50% reduction) will be followed clinically. Patients will be followed in clinic or by phone biannually (up to 3 years). The primary endpoint is improvement in abdominal pain. The secondary endpoints are reduction in narcotic use, healthcare utilization, and work days missed; return to employment; improvement in quality of life and weight gain. Results Proposed study. Limitations Strict inclusion criteria may limit enrollment. Conclusion The proposed study represents the first trial of endoscopic stenting for symptomatic CP and ductal strictures with a credible sham procedure, assessment of multiple dimensions of pain, and psychosocial factors. Clinical Trials 2009; 6: 455-463. Objective: To report on the long-term follow-up of a randomized clinical trial comparing pancreatic head resection according to Beger and limited pancreatic head excision combined with longitudinal pancreatico-jejunostomy according to Frey for surgical treatment of chronic pancreatitis. Summary Background Data: Resection and drainage are the 2 basic surgical principles in surgical treatment of chronic pancreatitis.
They are combined to various degrees by the classic duodenum-preserving pancreatic head resection (Beger) and limited pancreatic head excision combined with longitudinal pancreatico-jejunostomy (Frey). These procedures have been evaluated in a randomized controlled trial by our group. Long-term follow-up has not been reported so far. Methods: Seventy-four patients suffering from chronic pancreatitis were initially allocated to DPHR (n = 38) or LE (n = 36). This postoperative follow-up included the following parameters: mortality, quality of life (QL), pain (validated pain score), and exocrine and endocrine function. Results: Median follow-up was 104 months (72-144). Seven patients were not available for follow-up (Beger = 4; Frey = 3). There was no significant difference in late mortality (31% [8/26] versus 32% [8/25]). No significant differences were found regarding QL (global QL 66.7 [0-100] versus 58.35 [0-100]), pain score (11.25 [0-75] versus 11.25 [0-99.75]), exocrine (88% versus 78%) or endocrine insufficiency (56% versus 60%). Conclusions: After almost 9 years' long-term follow-up, there was no difference regarding mortality, quality of life, pain, or exocrine or endocrine insufficiency between the 2 groups. The decision which procedure to choose should be based on the surgeon's experience UNLABELLED Incidence and prevalence of chronic pancreatitis (CP) are poorly known and prospective nationwide epidemiologic estimation has never been performed. AIMS To estimate prospectively the national incidence and prevalence of patients attending gastroenterologists for CP in France. PATIENTS AND METHODS The study was proposed to all of the French gastroenterologists (N=3215), of whom 753 accepted to participate (24% private, 40% hospital and 36% both). Included were all patients suffering from proved or suspected CP, from 04-2003 to 07-2003.
Certain diagnostic criteria were pancreatic calcifications, ductal or histological abnormalities. For all of the non-responder gastroenterologists, a tracking system was used (mail or by phone). RESULTS A total of 456 gastroenterologists returned at least 1 case, on 1748 patients. Median patient age was 51 years; sex ratio was 5.07. Median duration between the first CP sign and the inclusion was 41 months. CP cause was alcoholism (84%), hereditary (1%), cystic fibrosis (1%), idiopathic (9%), other (6%). CP diagnosis was certain in 77%: calcifications (85%), ductal abnormalities (57%), and histology (8%). CP symptoms were: chronic abdominal pain (53%), acute pancreatitis episodes (67%), pseudocysts (40%), biliary tract compression (21%), diabetes mellitus (32%), pancreatic exocrine insufficiency (36%). Maximal annual incidence was 4,646 (crude annual incidence: 7.7 per 100,000; 12.9 in male; 2.6 in female) and prevalence was 15,832 cases (crude prevalence: 26.4 per 100,000; 43.8 in male; 9.0 in female). CONCLUSION New CP patients attending gastroenterologists number about 5,000 a year. CP prevalence is about 16,000 patients (in France: 60,400,000 inhabitants). Frequency of main complications is close to hospital series, confirming that results issued from these centers are not or only slightly biased BACKGROUND AND STUDY AIMS Endoscopic ductal decompression therapy has become an established method of treating patients with painful obstructive chronic pancreatitis. Smaller series, mostly with a medium-term follow-up period, have reported encouraging results. The present analysis presents long-term follow-up data from a large multicenter patient cohort. PATIENTS AND METHODS Patients with painful chronic pancreatitis and with ductal obstruction due to either strictures and/or stones treated endoscopically at eight different centers underwent follow-up after 2-12 years (mean 4.9 years).
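The crude rates quoted above are straightforward to check: rate per 100,000 = cases / population × 100,000. A quick sketch, using the population figure stated in the abstract (the incidence reproduces the reported 7.7; the prevalence comes out at ~26.2 versus the reported 26.4, so the authors presumably used a slightly different denominator):

```python
def rate_per_100k(cases, population):
    """Crude rate per 100,000 inhabitants."""
    return cases / population * 100_000

population = 60_400_000  # France, as stated in the abstract
print(round(rate_per_100k(4_646, population), 1))   # annual incidence → 7.7
print(round(rate_per_100k(15_832, population), 1))  # prevalence → 26.2
```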
The patients' clinical data, the rate of technical success, and complications were recorded from the charts. Follow-up data were prospectively obtained using structured questionnaires; the main parameter for evaluating treatment success was a significant reduction in pain (no pain or only weak pain). RESULTS Follow-up data were obtained from 1018 of 1211 patients treated (84%) with mainly strictures (47%), stones (18%), or strictures plus stones (32%). At the long-term follow-up, 60% of the patients had their endotherapy completed, 16% were still receiving some form of endoscopic treatment, and 24% had undergone surgery. The long-term success of endotherapy was 86% in the entire group, but only 65% in an intention-to-treat analysis. There were no significant differences between the patient groups with regard to either strictures, stones, or both. Pancreatic function was not positively affected by endoscopic therapy. CONCLUSIONS Endoscopic ductal decompression therapy offers relief of pain in two-thirds of the patients when it is used as the only form of treatment. One-quarter of the patients have to undergo surgery BACKGROUND & AIMS The pain pattern of chronic pancreatitis (CP) and its surgical implications are discussed. The aim of this study was to (1) define typical pain patterns, (2) correlate pain patterns with the presumptive causes of the pain, and (3) compare the natural history of patients treated conservatively or surgically with respect to pain relief, pancreatic dysfunction, and clinical outcome. METHODS The cohort in this prospective long-term study included 207 patients with alcoholic CP (91 without and 116 with surgery for pain relief). A clinically based staging system was applied to characterize pain in the evolution from onset to end-stage CP. RESULTS Average duration of CP was 17 years. In early-stage CP, episodes of recurrent (acute) pancreatitis predominated.
Chronic pain was typically associated with local complications (mainly pseudocysts, 84 of 155; 54%), relieved definitively by a single (drainage) procedure in approximately two-thirds of patients. Additional surgery was required for late pain recurrence in 39 patients (34%), primarily symptomatic cholestasis (18 of 39; 46%). All patients achieved complete pain relief in advanced CP. CONCLUSIONS In our experience, relief of chronic pain regularly follows selective surgery tailored to the presumptive pain cause or occurs spontaneously in uncomplicated advanced CP BACKGROUND Primary upper endoscopy (EGD) and transabdominal US (TUS) are often performed in patients with upper abdominal pain. OBJECTIVE Primary: Determine whether the combination of EGD and EUS was equivalent to EGD plus TUS in the diagnostic evaluation of upper abdominal pain. Secondary: Compare EUS versus TUS in detecting abdominal lesions, and compare EGD by using an oblique-viewing echoendoscope versus the standard, forward-viewing endoscope in detecting mucosal lesions. DESIGN Prospective, paired design. SETTING Six academic endoscopy centers. PATIENTS This study involved patients with upper abdominal pain referred for endoscopy. INTERVENTION All patients had EGD, EUS, and TUS. The EGD was done using both an oblique-viewing echoendoscope and the standard, forward-viewing endoscope (randomized order) by two separate endoscopists in a blinded fashion, followed by EUS. TUS was performed within 4 weeks of EGD/EUS, also in a blinded fashion. FOLLOW-UP Telephone interviews and chart reviews. MAIN OUTCOME MEASUREMENTS Diagnose possible etiology of upper abdominal pain and detect clinically significant lesions. RESULTS A diagnosis of the etiology of upper abdominal pain was made in 66 of 172 patients (38%).
The diagnostic rate was 42 of 66 patients (64%) for EGD plus EUS versus 41 of 66 patients (62%) for EGD plus TUS, which was statistically equivalent (McNemar test; P = .27). One hundred ninety-eight lesions were diagnosed with either EUS or TUS. EUS was superior to TUS for visualizing the pancreas (P < .0001) and for diagnosing chronic pancreatitis (P = .03). Two biliary stones were detected only by EUS. Two hundred fifty-one mucosal lesions were similarly diagnosed with EGD with either the standard, forward-viewing endoscope or the oblique-viewing echoendoscope (kappa = 0.48 [95% CI, .43-.54]). EGD with the standard, forward-viewing endoscope was preferred for biopsies. LIMITATIONS No cost analysis. CONCLUSION The combination of EGD with EUS is equivalent to EGD plus TUS for diagnosing a potential etiology of upper abdominal pain. EUS is superior to TUS for detecting chronic pancreatitis. EGD combined with EUS should be considered in the first-line diagnostic evaluation of patients with upper abdominal pain BACKGROUND Resection and drainage procedures are performed for chronic pancreatitis. After resection, pancreatic function deteriorates; however, little is known about the effect of drainage procedures. METHODS Pancreatic function was evaluated prospectively before and after surgery in 27 patients with duodenum-preserving resection of the head of the pancreas (DPRHP), and in 12 patients with pancreatico-jejunostomy (P-JS); 18 patients with chronic pancreatitis served as controls. Results of the 2 groups were not compared because of differences in patient characteristics and indications for surgery. Endpoints were exocrine function (fecal fat excretion, urinary PABA recovery), endocrine function (oral glucose tolerance test, serum C-peptide concentrations), and pancreatic polypeptide secretion. RESULTS Groups were not different with respect to age and duration of symptoms.
Median urinary PABA recovery was not altered significantly after surgery: DPRHP, from 40% to 31%; P-JS, from 52% to 44%; and controls, from 43% to 48%. Median fecal fat also did not change significantly: DPRHP, from 6 to 12 g/24 h; P-JS, from 9 to 5 g/24 h; and controls, from 6 to 7 g/24 h. Although the integrated blood glucose value did not change after DPRHP, the integrated serum C-peptide value decreased after DPRHP (P<.02). After P-JS, the integrated blood glucose value decreased (P<.02), but there was no change in integrated serum C-peptide secretion. Neither integrated blood glucose nor C-peptide values were affected in controls. Insulin dependency increased (22% to 33%) after DPRHP. Pancreatic polypeptide secretion decreased only after DPRHP (P=.003). CONCLUSIONS Surgery for chronic pancreatitis does not influence exocrine pancreatic function after either a drainage (P-JS) or a resection procedure (DPRHP). Clinical endocrine function is not affected after DPRHP but improves after P-JS Abstract Objective To determine whether poor reporting of methods in randomised controlled trials reflects on poor methods. Design Observational study. Setting Reports of randomised controlled trials conducted by the Radiation Therapy Oncology Group since its establishment in 1968. Participants The Radiation Therapy Oncology Group. Outcome measures Content of reports compared with the design features described in the protocols for all randomised controlled trials. Results The methodological quality of 56 randomised controlled trials was better than reported. Adequate allocation concealment was achieved in all trials but reported in only 42% of papers. An intention to treat analysis was done in 83% of trials but reported in only 69% of papers. The sample size calculation was performed in 76% of the studies, but reported in only 16% of papers.
End points were clearly defined and α and β errors were prespecified in 76% and 74% of the trials, respectively, but only reported in 10% of the papers. The one exception was the description of drop-outs, where the frequency of reporting was similar to that contained in the original statistical files of the Radiation Therapy Oncology Group. Conclusions The reporting of methodological aspects of randomised controlled trials does not necessarily reflect the conduct of the trial. Reviewing research protocols and contacting trialists for more information may improve quality assessment BACKGROUND AND STUDY AIMS Invasive treatment for abdominal pain due to chronic pancreatitis may be either surgical or endoscopic, particularly in cases of ductal obstruction. To date, the data published on the effectiveness of these two forms of therapy have been mostly retrospective, and there have been no randomized studies. A prospective, randomized study comparing surgery with endoscopy in patients with painful obstructive chronic pancreatitis was therefore conducted. PATIENTS AND METHODS Consecutive patients with pancreatic duct obstruction and pain were invited to participate in a randomized trial comparing endotherapy and surgery, the latter consisting of resection and drainage procedures, depending on the patient's individual situation. Patients who did not agree to participation and randomization were also further assessed using the same follow-up protocol. RESULTS Of 140 eligible patients, only 72 agreed to be randomized. Surgery consisted of resection (80%) and drainage (20%) procedures, while endotherapy included sphincterotomy and stenting (52%) and/or stone removal (23%). In the entire group, the initial success rates were similar for both groups, but at the 5-year follow-up, complete absence of pain was more frequent after surgery (37% vs. 14%), with the rate of partial relief being similar (49% vs. 51%).
In the randomized subgroup, results were similar (pain absence 34% after surgery vs. 15% after endotherapy; relief 52% after surgery vs. 46% after endotherapy). The increase in body weight was also greater by 20-25% in the surgical group, while new-onset diabetes developed with similar frequency in both groups (34-43%), again with no differences between the results for the whole group and the randomized subgroup. CONCLUSIONS Surgery is superior to endotherapy for long-term pain reduction in patients with painful obstructive chronic pancreatitis. Better selection of patients for endotherapy may be helpful in order to maximize results. Due to its low degree of invasiveness, however, endotherapy can be offered as a first-line treatment, with surgery being performed in case of failure and/or recurrence BACKGROUND & AIMS A randomized trial that compared endoscopic and surgical drainage of the pancreatic duct in patients with advanced chronic pancreatitis reported a significant benefit of surgery after a 2-year follow-up period. We evaluated the long-term outcome of these patients after 5 years. METHODS Between 2000 and 2004, 39 symptomatic patients were randomly assigned to groups that underwent endoscopic drainage or operative pancreaticojejunostomy. In 2009, information was collected regarding pain, quality of life, morbidity, mortality, length of hospital stay, number of procedures undergone, changes in pancreatic function, and costs. Analysis was performed according to an intention-to-treat principle. RESULTS During the 79-month follow-up period, one patient was lost and 7 died from unrelated causes. Of the patients treated by endoscopy, 68% required additional drainage compared with 5% in the surgery group (P = .001). Hospital stay and costs were comparable, but overall, patients assigned to endoscopy underwent more procedures (median, 12 vs 4; P = .001).
Moreover, 47% of the patients in the endoscopy group eventually underwent surgery. Although the mean difference in Izbicki pain scores was no longer significant (39 vs 22; P = .12), surgery was still superior in terms of pain relief (80% vs 38%; P = .042). Levels of quality of life and pancreatic function were comparable. CONCLUSIONS In the long term, symptomatic patients with advanced chronic pancreatitis who underwent surgery as the initial treatment for pancreatic duct obstruction had more relief from pain, with fewer procedures, than patients who were treated endoscopically. Importantly, almost half of the patients who were treated with endoscopy eventually underwent surgery Background In current practice, patients with chronic pancreatitis undergo surgical intervention in a late stage of the disease, when conservative treatment and endoscopic interventions have failed. Recent evidence suggests that surgical intervention early on in the disease benefits patients in terms of better pain control and preservation of pancreatic function. Therefore, we designed a randomized controlled trial to evaluate the benefits, risks and costs of early surgical intervention compared to the current stepwise practice for chronic pancreatitis. Methods/design The ESCAPE trial is a randomized controlled, parallel, superiority multicenter trial. Patients with chronic pancreatitis, a dilated pancreatic duct (≥ 5 mm) and moderate pain and/or frequent flare-ups will be registered and followed monthly as potential candidates for the trial. When a registered patient meets the randomization criteria (i.e. need for opioid analgesics), the patient will be randomized to either early surgical intervention (group A) or optimal current step-up practice (group B). An expert panel of chronic pancreatitis specialists will oversee the assessment of eligibility and ensure that allocation to either treatment arm is possible.
Patients in group A will undergo pancreaticojejunostomy or a Frey procedure in case of an enlarged pancreatic head (≥ 4 cm). Patients in group B will undergo a step-up practice of optimal medical treatment, if needed followed by endoscopic interventions, and if needed followed by surgery, according to predefined criteria. Primary outcome is pain assessed with the Izbicki pain score during a follow-up of 18 months. Secondary outcomes include complications, mortality, total direct and indirect costs, quality of life, pancreatic insufficiency, alternative pain scales, length of hospital admission, number of interventions and pancreatitis flare-ups. For the sample size calculation we defined a minimal clinically relevant difference in the primary endpoint as a difference of at least 15 points on the Izbicki pain score during follow-up. To detect this difference a total of 88 patients will be randomized (alpha 0.05, power 90%, drop-out 10%). Discussion The ESCAPE trial will investigate whether early surgery in chronic pancreatitis is beneficial in terms of pain relief, pancreatic function and quality of life, compared with current step-up practice. Trial registration: ISRCTN
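The power calculation above (difference of at least 15 Izbicki points, alpha 0.05, power 90%, drop-out 10%, 88 patients in total) follows the usual normal-approximation formula for comparing two means, n per group = 2(z_{1-alpha/2} + z_{1-beta})^2 (sd/delta)^2, inflated for drop-out. The abstract does not state the assumed between-patient SD, so the value of roughly 20 points used below is an assumption chosen only to land near the published total; this is a sketch of the arithmetic, not the trial's actual calculation:

```python
from math import ceil
from statistics import NormalDist

def two_group_sample_size(delta, sd, alpha=0.05, power=0.90, dropout=0.10):
    """Normal-approximation total sample size for a two-sample comparison
    of means, inflated for an expected drop-out fraction."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided alpha
    z_b = NormalDist().inv_cdf(power)
    n_per_group = ceil(2 * (z_a + z_b) ** 2 * (sd / delta) ** 2)
    total = 2 * n_per_group
    return ceil(total / (1 - dropout))          # enrol extra for drop-outs

# Delta = 15 points, ASSUMED sd of about 20 points (not stated in the abstract):
print(two_group_sample_size(delta=15, sd=20))
```

With these inputs the formula gives a total in the mid-80s, consistent with the 88 patients reported once rounding and the exact SD assumption are taken into account.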
11,062
31,739,315
Other Interventions for Motor Therapy The remaining studies compared various interventions for rehabilitation of motor deficits with control
Stroke affects nearly 800,000 individuals annually in the United States. It is the fifth most common cause of death and a leading cause of long-term disability (1). Although younger patients may be more physically capable of recovering from stroke than older patients, poor functional outcomes are commonplace. Approximately 44% of persons aged 18 to 50 years experience moderate disability after stroke, requiring some assistance with activities of daily living (ADLs) or mobility (2). In a study that followed patients from hospital admission to discharge, only 28% of patients with ischemic stroke deemed mild or improving returned home, whereas 16% required admission to short-term rehabilitation facilities and 11% were admitted to skilled nursing facilities (3). Common stroke presentations include focal weakness and numbness, facial weakness or slurred speech, and vision loss or neglect. Long-term sequelae can include weakness, spasticity, vision loss, dysphagia, and mood disorders. The prevalence of poststroke depression (PSD) ranges from 5% to 67%, depending on time since stroke and criteria used to diagnose PSD (4). Early management and rehabilitation are essential to minimize disability, decrease risk for further complications, and optimize regain of function (5, 6).
Background. A recent Cochrane Review showed that early robotic training of the upper limb in stroke survivors can be more effective than other interventions when improving activities of daily living involving the arm function is the aim of therapy. Objective. We tested the efficacy of a study protocol that involved the use of NeReBot therapy in partial substitution of standard upper limb rehabilitation in post-acute stroke patients. Methods. In this dose-matched, randomized controlled clinical trial, 34 hemiparetic participants with movement against gravity in shoulder, elbow, and wrist muscle groups were enrolled within 15 days of the onset of stroke. All participants received a total daily rehabilitation treatment of 120 minutes, 5 days per week for 5 weeks. The control group received standard therapy for the upper limb. The experimental group received standard therapy (65% of exercise time) associated with robotic training (35% of exercise time). Muscle tone (Modified Ashworth Scale), strength (Medical Research Council), and synergism (Fugl-Meyer motor scores) were measured at impairment level, whereas dexterity (Box and Block Test and Frenchay Arm Test) and activities of daily living (Functional Independence Measure) were measured at activity level. All assessments were performed at baseline, at the end of therapy (time T1), at 3 months (time T2), and at 7 months (time T3) after entry. All between-group analyses were tested using nonparametric tests with Bonferroni's adjustments for multiple testing. Results. No significant between-group differences were found with respect to demographic characteristics, motor scores, dexterity, and ADLs at baseline, postintervention (T1) or at follow-up (T2 and T3). Conclusions.
The robot therapy by NeReBot did not lead to better outcomes compared with conventional inpatient rehabilitation Background and Purpose — It is unknown whether one method of neuromuscular electrical stimulation for poststroke upper limb rehabilitation is more effective than another. Our aim was to compare the effects of contralaterally controlled functional electrical stimulation (CCFES) with cyclic neuromuscular electrical stimulation (cNMES). Methods — Stroke patients with chronic (> 6 months) moderate to severe upper extremity hemiparesis (n=80) were randomized to receive 10 sessions/wk of CCFES- or cNMES-assisted hand-opening exercise at home plus 20 sessions of functional task practice in the laboratory for 12 weeks. The task practice for the CCFES group was stimulation assisted. The primary outcome was change in Box and Block Test (BBT) score at 6 months post treatment. Upper extremity Fugl-Meyer and Arm Motor Abilities Test were also measured. Results — At 6 months post treatment, the CCFES group had greater improvement on the BBT, 4.6 (95% confidence interval [CI], 2.2-7.0), than the cNMES group, 1.8 (95% CI, 0.6-3.0), between-group difference of 2.8 (95% CI, 0.1-5.5), P=0.045. No significant between-group difference was found for the upper extremity Fugl-Meyer (P=0.888) or Arm Motor Abilities Test (P=0.096). Participants who had the largest improvements on BBT were < 2 years post stroke with moderate (ie, not severe) hand impairment at baseline. Among these, the 6-month post-treatment BBT gains of the CCFES group, 9.6 (95% CI, 5.6-13.6), were greater than those of the cNMES group, 4.1 (95% CI, 1.7-6.5), between-group difference of 5.5 (95% CI, 0.8-10.2), P=0.023. Conclusions — CCFES improved hand dexterity more than cNMES in chronic stroke survivors. Clinical Trial Registration — URL: http://www.clinicaltrials.gov.
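The between-group difference reported above, 2.8 (95% CI 0.1-5.5), can be reproduced from the two group-level intervals by backing each standard error out of its CI and combining them for independent groups. A sketch of that arithmetic, assuming the published intervals are symmetric normal-approximation CIs:

```python
from math import sqrt

def se_from_ci(lo, hi, z=1.96):
    """Back out a standard error from a reported symmetric 95% CI."""
    return (hi - lo) / (2 * z)

# Reported 6-month BBT changes: CCFES 4.6 (2.2-7.0), cNMES 1.8 (0.6-3.0).
se_ccfes = se_from_ci(2.2, 7.0)
se_cnmes = se_from_ci(0.6, 3.0)
se_diff = sqrt(se_ccfes ** 2 + se_cnmes ** 2)  # independent groups

diff = 4.6 - 1.8
lo, hi = diff - 1.96 * se_diff, diff + 1.96 * se_diff
print(round(diff, 1), round(lo, 1), round(hi, 1))
```

After rounding to one decimal place, the recomputed interval matches the published 2.8 (0.1-5.5).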
Unique identifier: NCT00891319 Background Effective poststroke motor rehabilitation depends on repeated limb practice with voluntary efforts. An electromyography (EMG)-driven neuromuscular electrical stimulation (NMES)-robot arm was designed for multi-joint physical training of the elbow, the wrist, and the fingers. Objectives To investigate the training effects of the device-assisted approach on subacute stroke patients and to compare the effects with those achieved by traditional physical treatments. Method This study was a pilot randomized controlled trial with a 3-month follow-up. Subacute stroke participants were randomly assigned into two groups, and then received 20-session upper limb training with the EMG-driven NMES-robotic arm (NMES-robot group, n = 14) or time-matched traditional therapy (the control, n = 10). For the evaluation of the training effects, clinical assessments including Fugl-Meyer Assessment (FMA), Modified Ashworth Score (MAS), Action Research Arm Test (ARAT), and Function Independence Measurement (FIM) were conducted before, after the rehabilitation training, and 3 months later. Session-by-session EMG parameters in the NMES-robot group, including normalized co-contraction indexes (CI) and EMG activation level of target muscles, were used to monitor the progress in muscular coordination patterns. Results Significant improvements were obtained in FMA (full score and shoulder/elbow), ARAT, and FIM [P < 0.001, effect sizes (EFs) > 0.279] for both groups. Significant improvement in FMA wrist/hand was only observed in the NMES-robot group (P < 0.001, EFs = 0.435) after the treatments. Significant reduction in MAS wrist was observed in the NMES-robot group after the training (P < 0.05, EFs = 0.145) and the effects were maintained for 3 months.
MAS scores in the control group were elevated following training (P < 0.05, EFs > 0.24), and remained at an elevated level when assessed 3 months later. The EMG parameters indicated a release of muscle co-contraction in the muscle pairs of biceps brachii and flexor carpi radialis and biceps brachii and triceps brachii, as well as a reduction of muscle activation level in the wrist flexor in the NMES-robot group. Conclusion The NMES-robot-assisted training was effective for early stroke upper limb rehabilitation and promoted independence in daily living comparable to traditional physical therapy. It could achieve higher motor outcomes at the distal joints and more effective release in muscle tones than the traditional therapy. Clinical Trial Registration ClinicalTrials.gov, identifier NCT02117089; date of registration: April 10, 2014 Background Most stroke survivors continue to experience motor impairments even after hospital discharge. Virtual reality-based techniques have shown potential for rehabilitative training of these motor impairments. Here we assess the impact of at-home VR-based motor training on functional motor recovery, corticospinal excitability and cortical reorganization. Objective The aim of this study was to identify the effects of home-based VR-based motor rehabilitation on (1) cortical reorganization, (2) the corticospinal tract, and (3) functional recovery after stroke in comparison to home-based occupational therapy. Methods We conducted a parallel-group, controlled trial to compare the effectiveness of domiciliary VR-based therapy with occupational therapy in inducing motor recovery of the upper extremities. A total of 35 participants with chronic stroke underwent 3 weeks of home-based treatment. A group of subjects was trained using a VR-based system for motor rehabilitation, while the control group followed conventional therapy.
Motor function was evaluated at baseline, after the intervention, and at 12-week follow-up. In a subgroup of subjects, we used Navigated Brain Stimulation (NBS) procedures to measure the effect of the interventions on corticospinal excitability and cortical reorganization. Results Results from the system's recordings and clinical evaluation showed significantly greater functional recovery for the experimental group when compared with the control group (1.53, SD 2.4 in Chedoke Arm and Hand Activity Inventory). However, functional improvements did not reach clinical significance. After the therapy, physiological measures obtained from a subgroup of subjects revealed an increased corticospinal excitability for distal muscles driven by the pathological hemisphere, that is, abductor pollicis brevis. We also observed a displacement of the centroid of the cortical map for each tested muscle in the damaged hemisphere, which strongly correlated with improvements in clinical scales. Conclusions These findings suggest that, in chronic stages, remote delivery of customized VR-based motor training promotes functional gains that are accompanied by neuroplastic changes. Trial Registration International Standard Randomized Controlled Trial Number NCT02699398 (Archived by ClinicalTrials.gov at https://clinicaltrials.gov/ct2/show/NCT02699398?term=NCT02699398&rank=1) OBJECTIVE To investigate the effectiveness of neuromuscular electrical stimulation (NMES) with or without other interventions in improving lower limb activity after chronic stroke. DATA SOURCES Electronic databases, including PubMed, EMBase, Cochrane Library, PEDro (Physiotherapy Evidence Database), and PsycINFO, were searched from inception to January 2017.
STUDY SELECTION We selected randomized controlled trials (RCTs) involving chronic stroke survivors with lower limb dysfunction and comparing NMES alone or combined with other interventions with a control group of no electrical stimulation treatment. DATA EXTRACTION The primary outcome was defined as lower limb motor function, and the secondary outcomes included gait speed, Berg Balance Scale, timed Up and Go, 6-minute walk test, Modified Ashworth Scale, and range of motion. DATA SYNTHESIS Twenty-one RCTs involving 1481 participants were identified from 5759 retrieved articles. Pooled analysis showed that NMES had a moderate but statistically significant benefit on lower limb motor function (standardized mean difference 0.42, 95% confidence interval 0.26-0.58), especially when NMES was combined with other interventions or treatment time was within either 6 or 12 weeks. NMES also had significant benefits on gait speed, balance, spasticity, and range of motion but showed no significant difference in walking endurance after NMES. CONCLUSIONS NMES combined with or without other interventions has beneficial effects on lower limb motor function in chronic stroke survivors. These data suggest that NMES should be a promising therapy to apply in chronic stroke rehabilitation to improve the capability of the lower extremity in performing activities Background Late-life depression is associated with high rates of morbidity, premature mortality, disability, functional decline, caregiver burden and increased health care costs. While clinical and public health approaches are focused on prevention or early intervention strategies, the ideal method of intervention remains unclear. No study has set out to evaluate the role of neurobiological agents in preventing depressive symptoms in older populations at risk of depression.
Methods/Design Subjects with previously reported sub-threshold depressive symptoms, aged 60 to 74 years, will be screened to participate in a single-centre, double-blind, randomised controlled trial with three parallel groups involving omega-3 fatty acid supplementation or sertraline hydrochloride, compared with matching placebo. Subjects will be excluded if they have current depression or suicide ideation; are taking antidepressants or any supplement containing omega-3 fatty acid; or have a prior history of stroke or other serious cerebrovascular or cardiovascular disease, neurological disease, significant psychiatric disease (other than depression) or neurodegenerative disease. The trial will consist of a 12-month treatment phase with follow-up at three months and 12 months to assess outcome events. At three months, subjects will undergo structural neuroimaging to assess whether treatment effects on depressive symptoms correlate with brain changes. Additionally, proton spectroscopy techniques will be used to capture brain-imaging markers of the biological effects of the interventions. The trial will be conducted in urban New South Wales, Australia, and will recruit a community-based sample of 450 adults. Using intention-to-treat methods, the primary endpoint is an absence of clinically relevant depression scores at 12 months between the omega-3 fatty acid and sertraline interventions and the placebo condition. Discussion The current health, social and economic costs of late-life depression make prevention imperative from a public health perspective. This innovative trial aims to address the long-neglected area of prevention of depression in older adults. The interventions are targeted to the pathophysiology of disease, and regardless of the effect size of treatment, the outcomes will offer major scientific advances regarding the neurobiological action of these agents.
The main results are expected to be available in 2017. Trial Registration Australian and New Zealand Clinical Trials Registry ACTRN12610000032055 (12 January 2010) Background. Timely provision of an ankle-foot orthosis (AFO) customized by an orthotist for individuals early after stroke can be problematic. Objective. To evaluate the efficacy of a therapist-made AFO (SWIFT Cast) for walking recovery. Methods. This was a randomized controlled, observer-blind trial. Participants (n = 105) were recruited 3 to 42 days poststroke. All received conventional physical therapy (CPT) that included use of "off-the-shelf" and orthotist-made AFOs. People allocated to the experimental group also received a SWIFT Cast for up to 6 weeks. Measures were undertaken before randomization, 6 weeks thereafter (outcome), and at 6 months after stroke (follow-up). The primary measure was walking speed. Clinical efficacy evaluation used analysis of covariance. Results. Use of a SWIFT Cast during CPT sessions was significantly higher (P < .001) for the SWIFT Cast group (55%) than the CPT group (3%). The CPT group used an AFO in 26% of CPT sessions, compared with 11% for the SWIFT Cast group (P = .005). At outcome, walking speed was 0.42 (standard deviation [SD] = 0.37) m/s for the CPT group and 0.32 (SD = 0.34) m/s for the SWIFT Cast group. Follow-up walking speed was 0.53 (SD = 0.38) m/s for the CPT group and 0.43 (SD = 0.34) m/s for the SWIFT Cast group. Differences, after accounting for minimization factors, were insignificant at outcome (P = .345) and follow-up (P = .360). Conclusion and implications. SWIFT Cast did not enhance the benefit of CPT, but the control group had greater use of another AFO.
However, SWIFT Cast remains a clinical option because it is low cost and custom-made by therapists who can readily adapt it during the rehabilitation period Background and Purpose — This study assessed whether cycling induced by functional electrical stimulation (FES) was more effective than passive cycling with placebo stimulation in promoting motor recovery and walking ability in postacute hemiparetic patients. Methods — In a double-blind, randomized, controlled trial, 35 patients were included and randomized to receive FES-induced cycling training or placebo FES cycling. The 4-week treatment consisted of 20 sessions lasting 25 minutes each. Primary outcome measures included the leg subscale of the Motricity Index and gait speed during a 50-meter walking test. Secondary outcomes were the Trunk Control Test, the Upright Motor Control Test, the mean work produced by the paretic leg, and the unbalance in mechanical work between paretic and nonparetic legs during voluntary pedaling. Participants were evaluated before training, after training, and at 3- to 5-month follow-up visits. Results — No significant differences were found between groups at baseline. Repeated-measures ANOVA (P<0.05) revealed significant increases in Motricity Index, Trunk Control Test, Upright Motor Control Test, gait speed, and mean work of the paretic leg after training and at follow-up assessments for FES-treated patients. No outcome measures demonstrated significant improvements after training in the placebo group. Both groups showed no significant differences between assessments after training and at follow-up. A main effect favoring FES-treated patients was demonstrated by repeated-measures ANCOVA for Motricity Index (P<0.001), Trunk Control Test (P=0.001), Upright Motor Control Test (P=0.005), and pedaling unbalance (P=0.038).
Conclusions — The study demonstrated that 20 sessions of FES cycling training significantly improved lower extremity motor functions and accelerated the recovery of overground locomotion in postacute hemiparetic patients. Improvements were maintained at follow-up Objective. To investigate the effectiveness of four-channel FES based on a normal gait pattern in improving functional ability in subjects early after ischemic stroke. Methods. Forty-five subjects were randomly assigned to a four-channel FES group (n = 16), a placebo group (n = 15), or a dual-channel group (n = 14). Stimulation lasted 30 min in each session, with 1 session/day, 5 days a week for 3 weeks. All subjects were assessed at baseline, at 3 weeks of treatment, and at 3 months after the treatment had finished. The assessments included Fugl-Meyer Assessment (FMA), the Postural Assessment Scale for Stroke Patients (PASS), Berg Balance Scale (BBS), Functional Ambulation Category (FAC), and the Modified Barthel Index (MBI). Results. All 3 groups demonstrated significant improvements in all outcome measurements from pre- to posttreatment and further gains at follow-up. The scores of FMA and MBI improved significantly in the four-channel group at the end of the 3 weeks of training. And the scores of PASS, BBS, MBI, and FAC in the four-channel group were significantly higher than those of the placebo group. Conclusions. This study indicated that four-channel FES can improve motor function, balance, walking ability, and performance of activities of daily living in subjects with early ischemic stroke Background: We aimed to research the value of extended nursing for cerebral stroke patients within a suitable recovery empty period. Methods: Seventy-two cerebral stroke patients were randomized to a control group or a treatment group during the recovery period at Xuzhou Recovery Hospital, China in 2016.
A recovery guidance exercise was applied to the control group for a set time, while a recovery guidance exercise combined with functional training was applied to the treatment group within the recovery empty period (at 6:00-7:00 a.m. and 7:00-8:00 p.m.). The recovery effect was compared after three months. Results: Following the three-month intervention, both the control and treatment groups' scores for the Fugl-Meyer balance evaluation and the Barthel indicator were increased. There was a statistically significant increase in the treatment group (P<0.05). Scores for the Self-Rating Depression Scale in both groups declined, and the decline in the treatment group was statistically significantly greater when compared to the control group (P<0.05). The total depression rate for the treatment group was significantly lower than the control group, and the severity of depression in the treatment group was significantly less than in the control group (P<0.05). Both groups' scores for the PSQI also decreased, with a significantly greater decrease in the treatment group (P<0.05). Conclusion: Extended nursing within a suitable recovery empty period can improve the patient's prognosis concerning physical activity and mood Abstract Purpose: To investigate the feasibility of combining physiotherapy and functional electrical stimulation to improve gait post stroke. Methods: A parallel-group, partially single-blinded randomised clinical trial. Adults living at home, less than 6 months post stroke, were randomised to Group A (physiotherapy, n = 10) or Group B (physiotherapy and common peroneal nerve stimulation, n = 10). Assessments were conducted before randomisation (Week 1), after intervention (Week 8) and after 12 weeks follow-up (Week 20). Results: No between-group differences were observed.
There were statistically significant within-group differences after the intervention period in both groups for walking speed and distance walked (without stimulation), Rivermead Mobility Index and Canadian Occupational Performance Measure, maintained at Week 20. There was a statistically significant improvement in 10-m walking speed (Group B) when the stimulator was used at Week 8 (p = 0.03, median 0.04 m/s (8%)). Only Group B had a statistically significant within-group change in Rivermead Visual Gait Analysis (Week 8), maintained at Week 20. Conclusions: Integrating electrical stimulation and physiotherapy was feasible and improved walking speed. There was no evidence of a training effect compared with physiotherapy alone. One hundred forty-four participants per group would produce an adequately powered study based on this protocol. Implications for Rehabilitation At the end of the intervention period, participants using electrical stimulation to correct dropped foot walked faster. It was feasible for electrical stimulation to be combined with physiotherapy for people less than 6 months post stroke. A larger, adequately powered study is required to establish whether there are training effects associated with use of stimulation in this population BACKGROUND Aquatic exercise programs are used in rehabilitation and might help to reduce disability after stroke. This was a randomized intervention trial to assess the influence of an aquatic exercise program on people suffering from depression and anxiety after ischemic stroke. METHODS Participants were randomized to an experimental group (EG) composed of 19 individuals (51.8±8.5 years; ten males and nine females), and a control group (CG) composed of 17 people (52.7±6.7 years; nine males and eight females). The aquatic exercise program consisted of two sessions per week, each lasting between 45 and 60 minutes and divided into 5- to 10-minute exercise sections, for 12 weeks.
The State-Trait Anxiety Inventory was used to determine anxiety levels, while the Beck Depression Inventory was used as a self-assessment of depression. RESULTS EG improved measures of depression, anxiety trait and anxiety state between pre- and post-treatment, with no changes in CG. EG improved in all tests related to functional capacity compared to CG. CONCLUSIONS The practice of aquatic exercises promotes improvements in the levels of depression and anxiety in people who suffered an ischemic stroke

Background and Purpose: Depression after stroke is prevalent, diminishing recovery and quality of life. Brief behavioral intervention, adjunctive to antidepressant therapy, has not been well evaluated for long-term efficacy in those with poststroke depression. Methods: One hundred one clinically depressed patients with ischemic stroke within 4 months of index stroke were randomly assigned to an 8-week brief psychosocial-behavioral intervention plus antidepressant or usual care, including antidepressant. The primary end point was reduction in depressive symptom severity at 12 months after entry. Results: Hamilton Rating Scale for Depression raw score in the intervention group was significantly lower immediately posttreatment (P<0.001) and at 12 months (P=0.05) compared with control subjects. Remission (Hamilton Rating Scale for Depression <10) was significantly greater immediately posttreatment and at 12 months in the intervention group compared with the usual care control. The mean percent decrease (47%±26% intervention versus 32%±36% control, P=0.02) and the mean absolute decrease (−9.2±5.7 intervention versus −6.2±6.4 control, P=0.023) in Hamilton Rating Scale for Depression at 12 months were clinically important and statistically significant in the intervention group compared with control.
Conclusion: A brief psychosocial-behavioral intervention is highly effective in reducing depression in both the short and long term

OBJECTIVE Systematic reviews suggest that mental practice as an additional therapy for people with stroke might be effective, and suggest that more trials with better-defined interventions are needed. This study investigated whether systematically imagining the skilled movement can contribute to a quicker and/or better recovery of stroke patients in long-term care. DESIGN A multicenter randomized controlled trial. SETTING Dutch nursing homes. PARTICIPANTS Stroke patients in the subacute phase of recovery. INTERVENTIONS Study participants were randomly assigned to the control or experimental group. Over a 6-week intervention period, both groups received multiprofessional therapy as usual. Additionally, patients in the experimental group had instruction on mental practice with a 4-step framework embedded in regular therapy time. MAIN OUTCOME Outcomes were assessed at 6 weeks and 6 months with the patient-perceived effect on performance of daily activities (10-point Numeric Rating Scale). Six secondary outcomes at impairment and activity level were also assessed. Primary analyses were performed according to the intention-to-treat principle. Generalized estimating equations (GEE) were used to analyze effects. RESULTS Thirty-six adult stroke patients (average age 77.8 ± 7.2 years) participated in the trial. No effect in favor of the mental practice intervention on any outcome measure could be detected at either measuring point. CONCLUSIONS This study could not show differences between embedded mental practice and the current standard of care. However, stroke pathways in Dutch nursing homes select specific and frail patients, which might have reduced the effects of training

BACKGROUND AND PURPOSE The purpose of this study was to compare the long-term effect of five daily sessions of 1 vs.
3 Hz repetitive transcranial magnetic stimulation (rTMS) on motor recovery in acute stroke. METHODS A total of 36 patients with acute ischaemic stroke participated in the study. The patients were randomly assigned to one of three groups: the first and second groups received real rTMS (1 and 3 Hz, respectively) and the third group received sham stimulation, daily for 5 days. Motor disability was assessed before and after the last session, and then after the first, second and third month. Cortical excitability was assessed before and after the second and fifth sessions. The outcome measure was clinical disability at 3 months post-rTMS. RESULTS No significant differences were found in basal rating scales between the three groups. At the 3-month time point, both of the real rTMS groups had improved significantly more on different rating scales than the sham group; in addition, the 1 Hz group performed better than the 3 Hz group. Measures of cortical excitability immediately after the last session showed that the 1 Hz group had reduced excitability of the non-stroke hemisphere and increased excitability of the stroke hemisphere, whereas the 3 Hz group only showed increased excitability of the stroke hemisphere. CONCLUSION These results confirm that five daily sessions of rTMS over the motor cortex, using either 1 Hz over the unaffected hemisphere or 3 Hz over the affected hemisphere, can enhance recovery. At 3 months, the improvement was more pronounced in the 1 Hz group

INTRODUCTION Paretic upper limb in stroke patients has a significant impact on quality of life. Modified Constraint-Induced Movement Therapy (mCIMT) is one of the treatment options used for improvement of the function of the paretic limb. AIM To investigate the efficacy of four-week duration mCIMT in the management of upper extremity weakness in hemiparetic patients due to stroke.
MATERIALS AND METHODS Prospective single-blind, parallel randomized controlled trial in which 30 patients received a conventional rehabilitation programme (control group) and 30 patients participated in an mCIMT programme in addition to the conventional rehabilitation programme (study group). The mCIMT included three-hour therapy sessions emphasizing affected-arm use in general functional tasks, three times a week for four weeks. Their normal arm was also constrained for five hours per day over five days per week. All the patients were assessed at baseline, one month and three months after completion of therapy using the Fugl-Meyer Assessment (FMA) score for the upper extremity and the Motor Activity Log (MAL) scale, comprising the Amount of Use (AOU) score and Quality of Use (QOU) score. RESULTS All the 3 scores improved significantly in both the groups at each follow-up. Post-hoc analysis revealed that, compared to the conventional rehabilitation group, the mCIMT group showed significantly better scores at 1 month {FMA1 (p-value <0.0001, ES 0.2870), AOU1 (p-value 0.0007, ES 0.1830), QOU1 (p-value 0.0015, ES 0.1640)} and 3 months {FMA3 (p-value <0.0001, ES 0.4240), AOU3 (p-value 0.0003, ES 0.2030), QOU3 (p-value 0.0008, ES 0.1790)}. CONCLUSION Four weeks' duration of mCIMT is effective in improving motor function in the paretic upper limb of stroke patients

BACKGROUND Mood and emotional disturbances are common in patients with stroke, and adversely affect the clinical outcome. We aimed to evaluate the efficacy of early administration of escitalopram to reduce moderate or severe depressive symptoms and improve emotional and neurological dysfunction in patients with stroke. METHODS This was a placebo-controlled, double-blind trial done at 17 centres in South Korea. Patients who had had an acute stroke within the past 21 days were randomly assigned in a 1:1 ratio to receive oral escitalopram (10 mg/day) or placebo for 3 months.
Randomisation was done with permuted blocks stratified by centre, via a web-based system. The primary endpoint was the frequency of moderate or severe depressive symptoms (Montgomery-Åsberg Depression Rating Scale [MADRS] ≥16). Endpoints were assessed at 3 months after randomisation in the full analysis set (patients who took study medication and underwent assessment of the primary endpoint after randomisation), in all patients who were enrolled and randomly assigned (intention to treat), and in all patients who completed the trial (per-protocol analysis). This trial is registered with ClinicalTrials.gov, number NCT01278498. FINDINGS Between Jan 27, 2011, and June 30, 2014, 478 patients were assigned to placebo (n=237) or escitalopram (n=241); 405 were included in the full analysis set (195 in the placebo group, 210 in the escitalopram group). The primary outcome did not differ by study group in the full analysis set (25 [13%] patients in the placebo group vs 27 [13%] in the escitalopram group; odds ratio [OR] 1·00, 95% CI 0·56-1·80; p>0·99) or in the intention-to-treat analysis (34 [14%] vs 35 [15%]; OR 1·01, 95% CI 0·61-1·69, p=0·96). The study medication was generally well tolerated; the most common adverse events were constipation (14 [6%] patients who received placebo vs 14 [6%] who received escitalopram), muscle pain (16 [7%] vs ten [4%]), and insomnia (12 [5%] vs 12 [5%]). Diarrhoea was more common in the escitalopram group (nine [4%] patients) than in the placebo group (two [1%] patients). INTERPRETATION Escitalopram did not significantly reduce moderate or severe depressive symptoms in patients with acute stroke.
FUNDING Dong-A Pharmaceutical and Ministry for Health, Welfare, and Family Affairs, South Korea

ABSTRACT Objective: To examine the effects of Solution Focused Brief Therapy (SFBT) in individuals after stroke on self-efficacy and symptoms of depression and anxiety. Design: Randomized controlled trial. Setting: Clinic of Adult Neurology of the Medical University of Gdańsk and M. Copernicus Pomeranian Traumatology Centre in Gdańsk. Subjects: A total of 62 patients, aged 54.0 ± 9.6 years. Interventions: They were randomly assigned to one of two groups: SFBT, participating in 10 therapy sessions, and control, not participating in any psychotherapy. Main measures: Symptoms of depression and anxiety according to the Hospital Anxiety and Depression Scale, the Mini-Mental Adjustment to Cancer (a scale originally designed for cancer patients) and the Self-efficacy Scale were examined at baseline and later at the same time intervals in both groups. Results: The intensity of depression and anxiety complaints dropped in the SFBT group (from 5.0 to 2.0 and 8.0 to 4.0, respectively; both p < .001, Friedman's ANOVA (analysis of variance)), whilst in the control group it remained unchanged. In addition to the gradual reduction of destructive attitudes (from 34.5 to 17.0), an increase in the number of constructive attitudes (from 42.0 to 50.5) and increased self-efficacy (from 79.0 to 96.0) were observed after therapy, but not in the control group. Conclusions: The authors suggest SFBT as a simple, beneficial and inexpensive method to manage patients after stroke

BACKGROUND Depression and anxiety are common after stroke. There is inconclusive evidence of the benefit of psychotherapy for poststroke depression and anxiety. Here, we used a brief intervention, Neuro-Linguistic Programming (NLP) brief therapy plus health education, to evaluate the changes in patients with ischemic stroke.
METHODS One hundred eighty patients were randomly allocated to receive 4 sessions of NLP plus health education (n = 90) or usual care (n = 90). A set of questionnaires was used preintervention and postintervention as well as at the 6-month follow-up. The primary outcomes were the prevalence of depression and anxiety, and the awareness of stroke knowledge. RESULTS More patients in the intervention group achieved remission of depressive (odds ratio [OR], 2.81; 95% confidence interval [CI], 1.41-5.59) and anxious symptoms (OR, 2.19; 95% CI, 1.15-4.18) after intervention. At the 6-month follow-up, we found no differences between groups in the prevalence of either depression or anxiety. After intervention, the intervention group had better awareness rates on most of the stroke knowledge items (P < .05). It also had better quality of life and physical function, both after intervention and at the follow-up (P < .05). CONCLUSIONS NLP plus health education could reduce depression and anxiety immediately after intervention, but not at the 6-month follow-up. The intervention could also improve awareness of stroke knowledge and benefit patients' quality of life and physical function

OBJECTIVES More than 50% of patients with upper limb paresis after stroke face long-term impaired arm function and ensuing disability in daily life. This study aims to evaluate the effectiveness of a task-oriented mental practice (MP) approach as an addition to regular arm-hand therapy in patients with subacute stroke. METHODS A multicenter, prospective, single-blind, randomized clinical trial was performed. Patients trained for 6 weeks, at least 3 times per day. In the experimental group, patients performed video-instructed MP. In the control group, patients performed neurodevelopmental therapy-based exercise therapy.
The primary outcome measures were the Fugl-Meyer test, Frenchay arm test, Wolf motor function test, and accelerometry. RESULTS The patients improved over time on the Fugl-Meyer test and Wolf motor function test in both the control and the experimental group. A significant improvement on the Frenchay arm test was found after training (which was maintained at 12-month follow-up) only in the experimental group. However, no difference in training effects between groups was demonstrated. CONCLUSIONS Training effects were demonstrated after MP training in patients with subacute stroke. However, the results of this study do not corroborate the hypothesis that the use of MP in addition to therapy as usual in patients with subacute stroke has an additional effect over neurodevelopmental therapy in addition to therapy as usual

OBJECTIVE The aim of this study was to investigate the effects of virtual reality (VR) balance training conducted using Kinect for Xbox® games on patients with chronic stroke. MATERIALS AND METHODS Fifty patients with mild to moderate motor deficits were recruited and randomly assigned to two groups: a VR plus standard treatment group and a standard treatment (ST) group. In total, 12 training sessions (90 minutes a session, twice a week) were conducted in both groups, and performance was assessed at three time points (pretest, post-test, and follow-up) by a blinded assessor. The outcome measures were the Berg Balance Scale (BBS), Functional Reach Test, and Timed Up and Go Test (cognitive; TUG-cog) for balance evaluations; the Modified Barthel Index for activities of daily living ability; the Activities-specific Balance Confidence Scale for balance confidence; and the Stroke Impact Scale for quality of life. The pleasure scale and adverse events were also recorded after each training session. RESULTS Both groups exhibited significant improvement over time in the BBS (P = 0.000) and TUG-cog test (P = 0.005).
The VR group rated the experience as more pleasurable than the ST group did during the intervention (P = 0.027). However, no significant difference was observed in other outcome measures within or between the groups. No serious adverse events were observed during the treatment in either group. CONCLUSIONS VR balance training using Kinect for Xbox games plus the traditional method had positive effects on the balance ability of patients with chronic stroke. The VR group experienced higher pleasure than the ST group during the intervention

Background: About 30% of stroke survivors clinically have depressive symptoms at some point following stroke, and anxiety prevalence is around 20-25%. Objective: The purpose of this brief report is to evaluate a pilot trial of a constructive integrative psychosocial intervention (CIPI) over standard care in post-stroke depression or anxiety. Methods: Patients were randomly assigned to either CIPI (n = 23) or standard care (n = 19). Patients were assessed using the Hospital Anxiety and Depression Scale at the 1st, 3rd, and 6th months to monitor changes of mood. Results: A Wilcoxon signed-rank test indicated that, compared to admission baseline, patients with the intervention had significantly normal post-stroke depression symptom levels at the 1st, 3rd, and 6th months (P < 0.005). Conclusion: CIPI appears to be of incremental value in treating depression as well as anxiety in subacute care

Objective: There is little randomized controlled trial (RCT) evidence to guide treatment for anxiety after stroke. We systematically reviewed RCTs of anxiety interventions in acquired brain injury (ABI) conditions, including stroke and traumatic brain injury (TBI), in order to summarize efficacy and key aspects of trial design to help guide future RCTs. Methods: We searched the Cochrane trial register, Medline, Embase, PsychInfo and CINAHL systematically up to August 2017.
Two independent reviewers systematically selected studies and extracted data. We summarized the effect size, key study characteristics and sources of potential bias in trial design. Results: 14 studies (12 stroke; one stroke & TBI; one TBI) with 928 participants were included. Meta-analysis of five psychotherapy comparisons favoured intervention over control (standardized mean difference (SMD): −0.41 [−0.79, −0.03], I² = 28%); the overall effect size of pharmacotherapy comparisons favoured intervention over control (SMD: −2.12 [−3.05, −1.18], I² = 89%). One comparison of mixed pharmacotherapy and psychotherapy favoured intervention over usual care (SMD: −4.79 [−5.87, −3.71]). One comparison favoured forest therapy versus an urban control (SMD: −2.00 [−2.59, −1.41]). All positive studies carried high or unclear risk of bias. Sample sizes were small in all included studies. Conclusions: There is low-quality evidence to suggest that psychotherapy and pharmacotherapy may be effective interventions in the treatment of anxiety after stroke, based on underpowered studies that carried high risk of bias. Large-scale, well-designed definitive trials are needed to establish whether pharmacological or psychological therapy works. Our review highlighted key considerations for investigators wishing to design high-quality trials to evaluate treatments for anxiety after stroke

Objectives: Robot-assisted movement training can help individuals with stroke reduce arm and hand impairment, but robot therapy is typically only about as effective as conventional therapy. Refining the way that robots assist during training may make them more effective than conventional therapy. Here, the authors measured the therapeutic effect of a robot that required individuals with a stroke to achieve virtual tasks in three dimensions against gravity.
Design: The robot continuously estimated how much assistance patients needed to perform the tasks and provided slightly less assistance than needed, to reduce patient slacking. Individuals with a chronic stroke (n = 26; baseline upper limb Fugl-Meyer score, 23 ± 8) were randomized into two groups and underwent 24 one-hour training sessions over 2 months. One group received the assist-as-needed robot training and the other received conventional tabletop therapy with the supervision of a physical therapist. Results: Training helped both groups significantly reduce their motor impairment, as measured by the primary outcome measure, the Fugl-Meyer score, but the improvement was small (3.0 ± 4.9 points for robot therapy vs. 0.9 ± 1.7 for conventional therapy). There was a trend for greater reduction in the robot-trained group (P = 0.07). The robot group largely sustained this gain at the 3-month follow-up. The robot-trained group also experienced significant improvements in Box and Blocks score and hand grip strength, whereas the control group did not, but these improvements were not sustained at follow-up. In addition, the robot-trained group showed a trend toward greater improvement in sensory function, as measured by the Nottingham Sensory Test (P = 0.06). Conclusions: These results suggest that in patients with chronic stroke and moderate-severe deficits, assisting in three-dimensional virtual tasks with an assist-as-needed controller may make robotic training more effective than conventional tabletop training

ABSTRACT Objective: To determine the effect of activity-based mirror therapy (MT) on motor recovery and gait in chronic poststroke hemiparetic subjects. Design: A randomised, controlled, assessor-blinded trial. Setting: Rehabilitation institute.
Participants: Thirty-six chronic poststroke (15.89 ± 9.01 months) hemiparetic subjects (age: 46.44 ± 7.89 years, 30 men, functional ambulation classification of median level 3). Interventions: Activity-based MT comprised movements such as ball-rolling, rocker-board, and pedalling. The activities were provided on the less-affected side in front of the mirror while hiding the affected limb. The movement of the less-affected lower limb was projected as over the affected limb. Conventional motor therapy based on neurophysiological approaches was also provided to the experimental group. The control group received only conventional management. Main outcome measures: Brunnstrom recovery stages (BRS), Fugl-Meyer assessment lower extremity (FMA-LE), Rivermead visual gait assessment (RVGA), and 10-metre walk test (10-MWT). Results: Postintervention, the experimental group exhibited significant and favourable changes for FMA-LE (mean difference = 3.29, 95% CI = 1.23-5.35, p = .003) and RVGA (mean difference = 5.41, 95% CI = 1.12-9.71, p = .015) in comparison to the control group. No considerable changes were observed on the 10-MWT. Conclusions: Activity-based MT facilitates motor recovery of the lower limb as well as reduces gait deviations among chronic poststroke hemiparetic subjects

OBJECTIVE To test whether a multistrategy intervention enhanced recovery immediately and longitudinally in patients with severe to moderate upper extremity (UE) paresis. DESIGN Double-blind, randomized controlled trial with placebo control. SETTING Outpatient department of a local medical center. PARTICIPANTS People (N=25) with chronic stroke were randomly assigned to 1 of 2 groups: a transcranial direct current stimulation with sensory modulation (tDCS-SM) group (n=14; mean age ± SD, 55.3±11.4y) or a control group (n=11; mean age ± SD, 56.9±13.5y). INTERVENTIONS Eight-week intervention.
The tDCS-SM group received bilateral tDCS, bilateral cutaneous anesthesia, and high repetitions of passive movements on the paretic hand. The control group received the same passive movements but with sham tDCS and sham anesthesia. During the experiment, all participants continued their regular rehabilitation. MAIN OUTCOME MEASURES Voluntary UE movement, spasticity, UE function, and basic activities of daily living. Outcomes were assessed at baseline, at postintervention, and at 3- and 6-month follow-ups. RESULTS No significant differences were found between groups. However, there was a trend for voluntary UE movement to improve more in the tDCS-SM group than in the control group, with a moderate immediate effect (partial η² [ηp²]=.14, P=.07) and moderate long-term effects (3-month follow-up: ηp²=.17, P=.05; 6-month follow-up: ηp²=.12, P=.10). Compared with the control group, the tDCS-SM group had a trend of a small immediate effect (ηp²=.02-.04) on reducing spasticity, but no long-term effect. A trend of small immediate and long-term effects in favor of tDCS-SM was found on UE function and daily function recovery (ηp²=.02-.09). CONCLUSIONS Accompanied by traditional rehabilitation, tDCS-SM showed a nonsignificant trend of immediate and longitudinal effects on voluntary UE movement recovery in patients with severe to moderate UE paresis after stroke, but its effects on spasticity reduction and functional recovery may be limited

Objective: To determine the appropriate treatments for post-ischaemic stroke depression at different times after stroke. Design: A single-blind, randomized, controlled trial that compared three intervention groups, with subgroups stratified by time after stroke. Setting: Outpatient clinic. Subjects: Eligible patients were recruited at discharge (n = 73) and three (n = 67), six (n = 65), and nine months (n = 69) after discharge, and patients completed mood questionnaires.
Interventions: Patients were randomly distributed into three groups: Group A received placebos and participated in general discussions; Group B received citalopram and participated in general discussions; and Group C received placebos and underwent cognitive behavioural therapy. All three groups participated in rehabilitation during three months of follow-up. Main measures: Outcome was assessed three months after baseline using the 17-item Hamilton Depression Scale (HAMD17) and the Bech-Rafaelsen Melancholia Scale (MES). During treatment, the Udvalg for Kliniske Undersogelser side-effect scale was also administered. Results: When stratification was not considered, the scores of Group B on the Melancholia Scale were lower than those of Group A (P = 0.02); when the four time-based subgroups were analysed, significant differences were observed between Groups A and B (PMES = 0.02, PHAMD17 = 0.02) in the group recruited six months after discharge, and between Groups A and C (PMES = 0.01) in the last time period, nine months after discharge. Conclusions: The effect of citalopram or cognitive behavioural therapy is similar to that of rehabilitation alone for early-onset post-ischaemic depression; rehabilitation plus citalopram for delayed-onset post-ischaemic depression, and rehabilitation plus cognitive behavioural therapy for late-onset post-ischaemic depression, are more effective than rehabilitation alone

Objective: To compare the effectiveness of upper extremity virtual reality rehabilitation training (VR) with time-matched conventional training (CT) in the subacute phase after stroke. Methods: In this randomized, controlled, single-blind phase III multicenter trial, 120 participants with upper extremity motor impairment within 12 weeks after stroke were consecutively included at 5 rehabilitation institutions.
Participants were randomized to either VR or CT as an adjunct to standard rehabilitation and stratified according to mild to moderate or severe hand paresis, defined as ≥20 degrees wrist and 10 degrees finger extension or less than that, respectively. The training comprised a minimum of sixteen 60-minute sessions over 4 weeks. The primary outcome measure was the Action Research Arm Test (ARAT); secondary outcome measures were the Box and Blocks Test and Functional Independence Measure. Patients were assessed at baseline, after intervention, and at the 3-month follow-up. Results: Mean time from stroke onset was 35 (SD 21) days for the VR group and 34 (SD 19) days for the CT group. There were no between-group differences for any of the outcome measures. Improvement of upper extremity motor function assessed with the ARAT was similar at the postintervention (p = 0.714) and follow-up (p = 0.777) assessments. Patients in VR improved 12 (SD 11) points from baseline to the postintervention assessment and 17 (SD 13) points from baseline to follow-up, while patients in CT improved 13 (SD 10) and 17 (SD 13) points, respectively. Improvement was also similar in our subgroup analysis of mild to moderate and severe upper extremity paresis. Conclusions: Additional upper extremity VR training was not superior to, but equally as effective as, additional CT in the subacute phase after stroke. VR may constitute a motivating training alternative as a supplement to standard rehabilitation. ClinicalTrials.gov identifier: NCT02079103.
Classification of evidence: This study provides Class I evidence that for patients with upper extremity motor impairment after stroke, compared to conventional training, VR training did not lead to significant differences in upper extremity function improvement

Background: The potential benefits of repetitive transcranial magnetic stimulation (rTMS), applied either alone or as a combination treatment, on recovery of the lower limbs after stroke have been insufficiently studied. Objective: The aim of the study was to evaluate the effect of priming with 1-Hz repetitive transcranial magnetic stimulation over the contralesional leg motor area with a double-cone coil before physical therapy on regaining ambulation. Methods: Thirty-eight subacute stroke patients with significant leg disabilities were randomly assigned to the experimental group or control group to receive 15 minutes of real or sham 1-Hz repetitive transcranial magnetic stimulation, respectively, over the contralesional motor cortex representing the quadriceps muscle, followed by 45 minutes of physical therapy, for 15 sessions over 3 weeks. Functional measures, motor evoked potentials, and quality of life were assessed. Results: There was no significant difference between the experimental group and the control group regarding recovery in ambulation, balance, motor functions, and activities of daily living. No significant difference was found in other functional measures or in quality of life. Only the control group displayed significantly increased cortical excitability of the contralesional hemisphere after the intervention.
Conclusions: The present study found insufficient evidence that contralesional priming with 1-Hz repetitive transcranial magnetic stimulation improves ambulatory and other motor functions among patients with severe leg dysfunction in subacute stroke

To evaluate the effectiveness of repetitive transcranial magnetic stimulation (rTMS) on motor recovery after stroke using a prospective, double-blind, randomized, sham-controlled study

BACKGROUND AND PURPOSE Repetitive transcranial magnetic stimulation (rTMS) changes the excitability of the motor cortex and thereby has the potential to enhance motor recovery after stroke. This randomized, sham-controlled, double-blind study was designed to compare the effects of high-frequency versus low-frequency rTMS on motor recovery during the early phase of stroke and to identify the neurophysiological correlates of motor improvements. METHODS A total of 69 first-ever ischemic stroke patients with motor deficits were randomly allocated to receive five daily sessions of 3-Hz ipsilesional rTMS, 1-Hz contralesional rTMS or sham rTMS in addition to standard physical therapy. Outcome measures included motor deficits, neurological scores and cortical excitability, which were assessed at baseline, after the intervention and at 3-month follow-up. RESULTS The rTMS groups manifested greater motor improvements than the control group, which were sustained for at least 3 months after the end of the treatment sessions. 1-Hz rTMS over the unaffected hemisphere produced more profound effects than 3-Hz rTMS in facilitating upper limb motor performance. There was a significant correlation between motor function improvement and motor cortex excitability change in the affected hemisphere. CONCLUSIONS Repetitive transcranial magnetic stimulation is a beneficial neurorehabilitative strategy for enhancing motor recovery in the acute and subacute phases after stroke

Objective.
To evaluate any clinical effects of home-based mirror therapy and subsequent cortical reorganization in patients with chronic stroke with moderate upper extremity paresis. Methods. A total of 40 chronic stroke patients (mean time post-onset, 3.9 years) were randomly assigned to the mirror group (n = 20) or the control group (n = 20) and then joined a 6-week training program. Both groups trained once a week under the supervision of a physiotherapist at the rehabilitation center and practiced at home 1 hour daily, 5 times a week. The primary outcome measure was the Fugl-Meyer motor assessment (FMA). Grip force, spasticity, pain, dexterity, hand use in daily life, and quality of life at baseline, posttreatment and at 6 months were all measured by a blinded assessor. Changes in neural activation patterns were assessed with functional magnetic resonance imaging (fMRI) at baseline and posttreatment in an available subgroup (mirror, 12; control, 9). Results. Posttreatment, the FMA improved more in the mirror group than in the control group (3.6 ± 1.5, P < .05), but this improvement did not persist at follow-up. No changes were found on the other outcome measures (all Ps > .05). fMRI results showed a shift in activation balance within the primary motor cortex toward the affected hemisphere in the mirror group only (weighted laterality index difference 0.40 ± 0.39, P < .05). Conclusion. This phase II trial showed some effectiveness of mirror therapy in chronic stroke patients and is the first to associate mirror therapy with cortical reorganization. Future research has to determine the optimum practice intensity and duration for improvements to persist and generalize to other functional domains

Background. Low-frequency repetitive transcranial magnetic stimulation (rTMS) of the contralesional primary motor cortex (M1) may improve recovery in patients with hemiparetic stroke. Objective.
To evaluate the effectiveness of applying 1-Hz rTMS to the contralesional M1 in addition to physiotherapy during early rehabilitation for stroke patients with hand hemiparesis in a randomized, sham-controlled, double-blind study. Methods. Forty patients with moderate upper extremity hemiparesis were randomized to receive 3 weeks of motor training (45 minutes daily) preceded by 30 minutes of 1-Hz rTMS applied to the contralesional M1 or 30 minutes of sham rTMS. Functional assessment of the paretic hand using the Wolf Motor Function Test was performed before, immediately after, and 3 months after completing treatment. Results. No statistically significant differences were found between the experimental and the control group for hand function (Wolf Motor Function Test; P = .92) or the level of neurological deficit (National Institutes of Health Stroke Scale [NIHSS]; P = .82) after treatment. Effect sizes for the experimental (d = 0.5) and the control group (d = 0.47) were small. Similar results were observed at the 3-month follow-up. Conclusions. The findings did not suggest that rTMS suppression of the contralesional motor cortex augments the effect of early neurorehabilitation for upper limb hemiparesis. Larger trials that stratify subjects based on residual motor function or physiological measures of excitation and inhibition may identify responders in the future Post-stroke depression (PSD) is a common yet severe sequela of stroke, and is often accompanied by somatic symptoms. Duloxetine, a new serotonin-norepinephrine reuptake inhibitor, may help to prevent depression after stroke. 95 ischemic stroke patients without depression were randomly divided into two groups: duloxetine group (n = 47) and control group (n = 48).
Patients in the control group received routine ischemic stroke therapy, whereas patients in the duloxetine group received duloxetine (dose range 30–90 mg) for 12 weeks in addition to routine therapy. Follow-up observations lasted for 24 weeks. The Hamilton Depression Scale was used to measure depression, and the National Institutes of Health Stroke Scale, Mini-Mental State Examination, Activities of Daily Living Scale (Chinese version) and Short Form 36 Health Survey Questionnaire were used to assess neurological function, cognitive function, rehabilitation from stroke and quality of life. Results showed that, overall, duloxetine reduced the incidence of both minor and major depression in ischemic stroke patients by 16%. In addition, duloxetine helped patients to rehabilitate more rapidly from stroke, and was associated with better cognitive function and quality of life. In conclusion, the prophylactic use of duloxetine not only decreased the incidence of PSD, but also promoted rehabilitation, cognitive function and quality of life
11,063
29,470,825
Conclusions The results of the present systematic review and meta-analysis suggest a significant effect of RT frequency, as higher training frequencies translate into greater muscular strength gains. However, these effects seem to be primarily driven by training volume, because when volume is equated there was no significant effect of RT frequency on muscular strength gains. Thus, from a practical standpoint, greater training frequencies can be used to accumulate additional RT volume, which is then likely to result in greater muscular strength gains. However, it remains unclear whether RT frequency on its own has significant effects on strength gain. It seems that higher RT frequencies result in greater gains in muscular strength on multi-joint exercises in the upper body and in women, and, finally, in contrast to older adults, young individuals seem to respond more positively to greater RT frequencies.
Background Current recommendations on resistance training (RT) frequency for gains in muscular strength are based on extrapolations from limited evidence on the topic, and thus their practical applicability remains questionable. Objective To elucidate this issue, we conducted a systematic review and meta-analysis of the studies that compared muscular strength outcomes with different RT frequencies.
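The pooling step of such a meta-analysis is not detailed in the abstract; a common approach is DerSimonian-Laird random-effects pooling of standardized mean differences. A minimal sketch is given below, where every effect size and variance is a made-up illustrative number, not data from any of these studies:

```python
import math

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooling of study effect sizes."""
    # Fixed-effect (inverse-variance) weights and pooled estimate
    w = [1.0 / v for v in variances]
    sw = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sw
    # Cochran's Q heterogeneity statistic and between-study variance tau^2
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)
    # Random-effects weights add tau^2 to each within-study variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se, tau2

# Hypothetical standardized mean differences (higher vs. lower RT frequency)
effects = [0.10, 0.60, 0.20, 0.55]
variances = [0.02, 0.03, 0.02, 0.03]
pooled, se, tau2 = random_effects_pool(effects, variances)
print(round(pooled, 3), round(se, 3), round(tau2, 4))
```

Note the design choice: when the studies are heterogeneous (Q exceeds its degrees of freedom), tau^2 is positive and the weights move toward equality, widening the confidence interval relative to a fixed-effect model.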
OBJECTIVE To determine the effect of frequency of resistive training on gain in muscle strength and neuromuscular performance in healthy older adults. DESIGN A randomized controlled trial with subjects assigned either to high-intensity resistance training 1 (EX1), 2 (EX2), or 3 (EX3) days per week for 24 weeks or to a control group (CO). SETTING An exercise facility at an academic medical center. SUBJECTS Forty-six community-dwelling healthy men (n = 29) and women (n = 17) aged 65 to 79 years. INTERVENTION Progressive resistance training consisting of three sets of eight exercises targeting major muscle groups of the upper and lower body, at 80% of one-repetition maximum (1-RM) for eight repetitions, either 1, 2, or 3 days per week. MEASURES Dynamic muscle strength (1-RM) using isotonic equipment every 4 weeks, bone mineral density and body composition by dual-energy X-ray absorptiometry (DXA), and neuromuscular performance by timed chair rise and 6-meter backward tandem walk. RESULTS For each of the eight exercises, muscle strength increased in the exercise groups relative to CO (P < .01), with no difference among the EX1, EX2 and EX3 groups at any measurement interval. Percent change averaged 3.9 +/- 2.4 (CO), 37.0 +/- 15.2 (EX1), 41.9 +/- 18.2 (EX2), and 39.7 +/- 9.8 (EX3). The time to rise successfully from the chair 5 times decreased significantly (P < .01) at 24 weeks, whereas improvement in the 6-meter backward tandem walk approached significance (P = .10) in the three exercise groups compared with CO. Changes in chair rise ability were correlated with percent changes in quadriceps strength (r = -0.40, P < .01) and lean mass (r = -0.40, P < .01). CONCLUSIONS A program of once or twice weekly resistance exercise achieves muscle strength gains similar to 3 days per week training in older adults and is associated with improved neuromuscular performance.
Such improvement could potentially reduce the risk of falls and fracture in older adults Abstract Fisher, G, McCarthy, JP, Zuckerman, PA, Bryan, DR, Bickel, CS, and Hunter, GR. Frequency of combined resistance and aerobic training in older women. J Strength Cond Res 27(7): 1868–1876, 2013—The aim of this study was to determine the optimal frequency of combined aerobic and resistance training for improving muscular strength (MS), cardiovascular fitness (CF), and functional tasks (FTs) in women older than 60 years. Sixty-three women were randomly assigned to 1 of 3 exercise training groups. Group 1 performed 1 resistance exercise training (RET) and 1 aerobic exercise training (AET) session per week (AET/RET 1 × wk−1); group 2 performed 2 RET and 2 AET sessions per week (AET/RET 2 × wk−1); and group 3 performed 3 RET and 3 AET sessions per week (AET/RET 3 × wk−1). MS, CF, and FT measurements were made pretraining and 16 weeks posttraining. Repeated-measures analysis of variance indicated a significant time effect for changes in MS, CF, and FT, such that all improved after training. However, there were no significant training group or training group × time interactions. Sixteen weeks of combined AET/RET (1 × wk−1, 2 × wk−1, or 3 × wk−1) led to significant improvements in MS, CF, exercise economy, and FT. However, there were no significant differences in MS, CF, or FT outcomes between groups PURPOSE The purpose of this study was to compare changes in maximal strength, power, and muscular endurance after 12 wk of periodized heavy-resistance training directly supervised by a personal trainer (SUP) versus unsupervised training (UNSUP). METHODS Twenty moderately trained men aged 24.6 +/- 1.0 yr (mean +/- SE) were randomly assigned to either the SUP group (N = 10) or the UNSUP group (N = 8).
Both groups performed identical linear periodized resistance training programs consisting of preparatory (10- to 12-repetition maximum (RM)), hypertrophy (8- to 10-RM), strength (5- to 8-RM), and peaking phases (3- to 6-RM) using free-weight and variable-resistance machine exercises. Subjects were tested for maximal squat and bench press strength (1-RM), squat jump power output, bench press muscular endurance, and body composition at week 0 and after 12 wk of training. RESULTS Mean training loads (kg per set) per week were significantly (P < 0.05) greater in the SUP group than in the UNSUP group at weeks 7 through 11 for the squat, and weeks 3 and 7 through 12 for the bench press exercises. The rates of increase (slope) of squat and bench press kg per set were significantly greater in the SUP group. Maximal squat and bench press strength were significantly greater at week 12 in the SUP group. Squat and bench press 1-RM, and mean and peak power output increased significantly after training in both groups. Relative local muscular endurance (80% of 1-RM) was not compromised in either group despite the significantly greater loads used in bench press muscular endurance testing after training. Body mass, fat mass, and fat-free mass increased significantly after training in the SUP group. CONCLUSION Directly supervised, heavy-resistance training in moderately trained men resulted in a greater rate and magnitude of training load increase, which resulted in greater maximal strength gains compared with unsupervised training Abstract Seventeen subjects performed resistance training of the leg extensor and flexor muscle groups two (2/wk) or three (3/wk) times per week.
Changes in the relative myosin heavy chain (MHC) isoform contents (I, IIa and IIx) of the vastus lateralis and isometric, isokinetic and squat-lift one-repetition maximum (1RM) strength were compared between conditions after both a common training period (6 weeks) and a common number of training sessions (18). After 6 weeks and 18 sessions (9 weeks for the 2/wk group), increments in 1RM strength for the 3/wk and 2/wk groups were similar [effect size (ES) differences ≈0.3, 3/wk > 2/wk], whereas the 2/wk group presented greater isokinetic (ES differences = 0.3–1.2) and isometric (ES differences ≈0.7) strength increases than the 3/wk condition. A significant (P < 0.05) increase in MHC IIa percentage was evident for the 2/wk group after 18 sessions. Both training groups exhibited a trend towards a reduction in the relative MHC IIx content and an increase in the MHC IIa content (ES range = 0.5–1.24). However, correlations between changes in the strength and MHC profiles were weak (r2: 0.0–0.5). Thus, isometric and isokinetic strength responses to variations in training frequency differed from 1RM strength responses, and changes in strength were not strongly related to alterations in relative MHC content Background: Strength training has been shown to benefit the health and function of older adults. Objective: To investigate whether one set of exercises performed once a week was as effective in increasing muscle strength as training twice a week. Methods: 18 subjects (7 women and 11 men) aged 65–79 years were randomly assigned to two groups. Both groups performed one set of exercises to muscular fatigue; group 1 trained 1 day/week and group 2 trained 2 days/week on three lower- and three upper-body exercises for 9 weeks. The data were analysed using a mixed-model 2 × 2 analysis of variance. Results: A significant main effect of time (p < 0.001), but not group, on one-repetition maximum scores was observed.
No significant interaction was observed between time and group, and therefore no difference in strength changes between training once a week and twice a week after 9 weeks. Conclusions: One set of exercises performed once weekly to muscle fatigue improved strength as well as twice-weekly training in the older adult. Our results provide information that will assist in designing strength-training programmes that are more time- and cost-efficient in producing health and fitness benefits for older adults Abstract Schoenfeld, BJ, Ratamess, NA, Peterson, MD, Contreras, B, and Tiryaki-Sonmez, G. Influence of resistance training frequency on muscular adaptations in well-trained men. J Strength Cond Res 29(7): 1821–1829, 2015—The purpose of this study was to investigate the effects of training muscle groups 1 day per week using a split-body routine (SPLIT) vs. 3 days per week using a total-body routine (TOTAL) on muscular adaptations in well-trained men. Subjects were 20 male volunteers (height = 1.76 ± 0.05 m; body mass = 78.0 ± 10.7 kg; age = 23.5 ± 2.9 years) recruited from a university population. Participants were pair-matched according to baseline strength and then randomly assigned to 1 of the 2 experimental groups: a SPLIT, where multiple exercises were performed for a specific muscle group in a session with 2–3 muscle groups trained per session (n = 10), or a TOTAL, where 1 exercise was performed per muscle group in a session with all muscle groups trained in each session (n = 10). Subjects were tested pre- and poststudy for 1 repetition maximum strength in the bench press and squat, and muscle thickness (MT) of the forearm flexors, forearm extensors, and vastus lateralis. Results showed significantly greater increases in forearm flexor MT for TOTAL compared with SPLIT. No significant differences were noted in maximal strength measures.
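Several of the trials above summarize between-condition differences as effect sizes (e.g., ES ≈ 0.3–1.2, or d = 0.5). The usual computation is Cohen's d: the mean difference divided by the pooled standard deviation. A minimal sketch follows, where the strength-gain numbers are invented for illustration and not taken from any of these studies:

```python
import math
import statistics

def cohens_d(group_a, group_b):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    na, nb = len(group_a), len(group_b)
    ma, mb = statistics.mean(group_a), statistics.mean(group_b)
    va, vb = statistics.variance(group_a), statistics.variance(group_b)  # sample variances
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled_sd

# Hypothetical 1RM strength gains (kg) under two training frequencies
gains_2x = [8, 10, 12, 9, 11]
gains_3x = [9, 12, 13, 11, 14]
d = cohens_d(gains_3x, gains_2x)
print(round(d, 2))
```

By the conventional benchmarks used in these abstracts, d ≈ 0.2 is small, 0.5 medium, and 0.8 large, which is why differences around 0.3 are reported as trivial-to-small.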
The findings suggest a potentially superior hypertrophic benefit of higher weekly resistance training frequencies PURPOSE To compare the effects of a 16-wk training period (2 d.wk(-1)) of resistance training alone (S), endurance training alone (E), or combined resistance (once weekly) and endurance (once weekly) training (SE) on muscle mass, maximal strength and power of the leg and arm extensor muscles, and maximal workload (Wmax) using an incremental cycling test in older men. METHODS Thirty-one healthy men (65-74 yr) were divided into three treatment groups to train 2x wk(-1) for 16 wk: S (N = 10), E (N = 11), or SE (N = 10; 1x wk(-1) S + 1x wk(-1) E). The subjects were tested at 8-wk intervals (i.e., weeks 8 and 16). RESULTS There were no significant differences between S- and SE-induced muscle hypertrophy (11% and 11%) and maximal strength (41% and 38%) gains of the legs, or between E- and SE-induced Wmax (28% and 23%) gains. The increase in arm strength in S (36%) was greater than that recorded in SE (22%) and greater than that recorded in E (0%). CONCLUSIONS Prolonged combined resistance and endurance training in older men seemed to lead to similar gains in muscle mass, maximal strength, and power of the legs as resistance training alone, and to similar gains in maximal peak power output measured in an incremental cycling test as endurance training alone. These findings may have an effect on how resistance exercise is prescribed to older adults Purpose To determine if muscle growth is important for increasing muscle strength or if changes in strength can be entirely explained by practicing the strength test. Methods Thirty-eight untrained individuals performed knee extension and chest press exercise for 8 wk.
Individuals were randomly assigned to either a high-volume training group (HYPER) or a group just performing the one repetition maximum (1RM) strength test (TEST). The HYPER group performed four sets to volitional failure (~8RM–12RM), whereas the TEST group performed up to five attempts to lift as much weight as possible one time each visit. Results Data are presented as mean (90% confidence interval). The change in muscle size was greater in the HYPER group for both the upper and lower bodies at most, but not all, sites. The change in 1RM strength for both the upper body (difference of −1.1 [−4.8, 2.4] kg) and lower body (difference of 1.0 [−0.7, 2.8] kg for the dominant leg) was not different between groups (similar for the nondominant leg). Changes in isometric and isokinetic torque were not different between groups. The HYPER group observed a greater change in muscular endurance (difference of 2 [1, 4] repetitions) only in the dominant leg. There were no differences in the change between groups in upper body endurance. There were between-group differences in exercise volume (mean [95% confidence interval]) for the dominant leg (difference of 11,049.3 [9254.6–12,844.0] kg; similar for the nondominant leg) and chest press, with the HYPER group completing significantly more total volume (difference of 13,259.9 [9632.0–16,887.8] kg). Conclusions These findings suggest that neither exercise volume nor the change in muscle size from training contributed to greater strength gains compared with just practicing the test Gentil, P and Bottaro, M. Influence of supervision ratio on muscle adaptations to resistance training in nontrained subjects. J Strength Cond Res 24(3): 639-643, 2010—The purpose of the present study was to compare the changes in muscle strength in nontrained young males performing resistance training under different supervision ratios.
One hundred twenty-four young men were randomly assigned to groups trained under a high (HS, 1:5 coach-to-athlete ratio) or low (LS, 1:25) supervision ratio. Both groups performed identical resistance training programs. Subjects were tested for maximum bench press 1 repetition maximum (1RM) and knee extensor torque before and after 11 weeks of training. According to the results, only HS led to a significant increase (11.8%) in knee extensor torque. Both groups significantly increased bench press 1RM load; the increases were 10.22% for LS and 15.9% for HS. The results revealed significant differences between groups for changes in knee extensor torque and 1RM bench press, with higher values for the HS group. There were no differences between groups for the increases in bench press and leg press work volume or training attendance. The proportion of subjects training with maximum intensity was higher in HS for both the bench press and leg press exercises. In addition, the proportion of subjects training with maximal intensity was higher for the bench press than for the leg press exercise in both groups. The primary findings of the present study are that the strength gains for both lower- and upper-body muscles are greater in subjects training under higher supervision ratios, probably because of higher exercise intensity. These results confirm the importance of direct supervision during resistance training The aim of this study was to compare the effect of resistance training (RT) performed with different frequencies, followed by a detraining period, on muscular strength and oxidative stress (OS) biomarkers in older women. Twenty-seven physically independent women (68.8 ± 4.8 years, 69.1 ± 14.3 kg, 156.0 ± 6.5 cm, and 28.3 ± 4.9 kg.m−2) were randomly assigned to perform a RT program for 2 or 3 days per week (G2X = 13 vs. G3X = 14) for 12 weeks, followed by a 12-week detraining period.
One repetition maximum (1RM) tests were used as measures of muscular strength (three exercises, three attempts for each exercise, 3–5 min of rest between attempts, and 5 min of rest between exercises). Advanced oxidation protein products (AOPP) and total radical-trapping antioxidant parameter (TRAP) were used as oxidative stress indicators. Both groups increased muscular strength after 12 weeks of training (P < 0.05) in the chest press (G2X = +11.9% vs. G3X = +27.5%, P < 0.05), knee extension (G2X = +18.4% vs. G3X = +16.7%, P > 0.05), and preacher curl (G2X = +37.6% vs. G3X = +36.7%, P > 0.05). On the other hand, 12 weeks of detraining were not sufficient to eliminate the major effects produced by RT on muscular strength, although a significant decrease (P < 0.05) was observed for the chest press (G3X = −9.1% vs. G2X = −10.2%, P > 0.05), knee extension (G2X = −14.9% vs. G3X = −12.1%, P > 0.05), and preacher curl (G2X = −20.5% vs. G3X = −17.4%, P > 0.05). Pre- to post-training, both groups showed significant (P < 0.05) increases in TRAP (G2X = +6.9% vs. G3X = +15.1%) with no statistically significant difference between the groups (P > 0.05), and the scores remained elevated compared with pre-training after 12 weeks of detraining. AOPP was not changed by RT or detraining (P > 0.05). The results suggest that a 12-week RT program with a frequency of 2 days per week may be sufficient to improve muscular strength and OS in older women, and that detraining for 12 weeks does not completely reverse the changes induced by RT This study compared the effects of different weekly training frequencies on the cardiovascular and neuromuscular adaptations induced by concurrent training in previously trained elderly men.
After 20 weeks of combined strength and endurance training, twenty-four healthy elderly men (65 ± 4 years) were randomly placed into two frequency training groups: strength and endurance training performed twice a week (SE2, n = 12), or strength and endurance training performed three times per week (SE3, n = 12). The interventions lasted 10 weeks and each group performed identical exercise intensity and volume per session. Before and after the exercise training, a one-repetition maximum test (1RM), isometric peak torque (PT), maximal surface electromyographic activity (EMG), as well as muscle thickness (MT) were examined. Additionally, peak oxygen uptake (VO(2peak)), maximum aerobic workload (W(max)), and first and second ventilatory thresholds (VT1 and VT2) were evaluated. There were significant increases in upper- and lower-body 1RM, MT, VO(2peak), VT1 and VT2, with no differences between groups. There were no changes after training in maximal EMG and isometric peak torque. W(max) improved only in SE3. After 10 weeks of training, twice-weekly combined strength and endurance training led to similar neuromuscular and cardiovascular adaptations as training three times per week, demonstrating the efficiency of a lower frequency of concurrent training in previously trained elderly men OBJECTIVE: To test the feasibility of creating a valid and reliable checklist with the following features: appropriate for assessing both randomised and non-randomised studies; provision of both an overall score for study quality and a profile of scores not only for the quality of reporting, internal validity (bias and confounding) and power, but also for external validity. DESIGN: A pilot version was first developed, based on epidemiological principles, reviews, and existing checklists for randomised studies.
Face and content validity were assessed by three experienced reviewers, and reliability was determined using two raters assessing 10 randomised and 10 non-randomised studies. Using different raters, the checklist was revised and tested for internal consistency (Kuder-Richardson 20), test-retest and inter-rater reliability (Spearman correlation coefficient and sign rank test; kappa statistics), criterion validity, and respondent burden. MAIN RESULTS: The performance of the checklist improved considerably after revision of a pilot version. The Quality Index had high internal consistency (KR-20: 0.89), as did the subscales apart from external validity (KR-20: 0.54). Test-retest (r 0.88) and inter-rater (r 0.75) reliability of the Quality Index were good. Reliability of the subscales varied from good (bias) to poor (external validity). The Quality Index correlated highly with an existing, established instrument for assessing randomised studies (r 0.90). There was little difference between its performance with non-randomised and with randomised studies. Raters took about 20 minutes to assess each paper (range 10 to 45 minutes). CONCLUSIONS: This study has shown that it is feasible to develop a checklist that can be used to assess the methodological quality not only of randomised controlled trials but also of non-randomised studies. It has also shown that it is possible to produce a checklist that provides a profile of the paper, alerting reviewers to its particular methodological strengths and weaknesses. Further work is required to improve the checklist and the training of raters in the assessment of external validity Abstract Ferreira, DV, Ferreira-Júnior, JB, Soares, SRS, Cadore, EL, Izquierdo, M, Brown, LE, and Bottaro, M. Chest press exercises with different stability requirements result in similar muscle damage recovery in resistance-trained men.
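The KR-20 internal-consistency coefficients reported for the checklist (0.89 overall, 0.54 for external validity) are computed directly from dichotomous item scores. A minimal sketch of the standard Kuder-Richardson 20 formula follows; the 0/1 score matrix is invented for illustration only:

```python
def kr20(items):
    """Kuder-Richardson 20 for dichotomous (0/1) items.

    `items` is a list of scored papers, each a list of 0/1 item scores.
    """
    k = len(items[0])            # number of checklist items
    n = len(items)               # number of papers scored
    totals = [sum(row) for row in items]
    mean_total = sum(totals) / n
    var_total = sum((t - mean_total) ** 2 for t in totals) / (n - 1)
    pq = 0.0
    for j in range(k):
        p = sum(row[j] for row in items) / n   # proportion scoring 1 on item j
        pq += p * (1 - p)
    return (k / (k - 1)) * (1 - pq / var_total)

# Hypothetical checklist scores: 6 papers x 5 yes/no items
scores = [
    [1, 1, 1, 1, 0],
    [1, 1, 1, 0, 0],
    [1, 1, 0, 0, 0],
    [0, 1, 1, 1, 1],
    [1, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
]
result = kr20(scores)
print(round(result, 3))
```

Values near 1 indicate that the checklist items tend to rise and fall together across papers; the low KR-20 for the external-validity subscale above signals that its few items do not behave as a coherent scale.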
J Strength Cond Res 31(1): 71–79, 2017—This study investigated the time course of 96 hours of muscle recovery after 3 different chest press exercises with different stability requirements in resistance-trained men. Twenty-seven men (23.5 ± 3.8 years) were randomly assigned to one of 3 groups: (a) Smith machine bench press; (b) barbell bench press; or (c) dumbbell bench press. Participants performed 8 sets of 10 repetition maximum with 2 minutes of rest between sets. Muscle thickness, peak torque (PT), and soreness were measured pre, post, 24, 48, 72, and 96 hours after exercise. There were no differences in the time course of PT or muscle thickness values of the pectoralis major (p = 0.98 and p = 0.91, respectively) or elbow extensors (p = 0.07 and p = 0.86, respectively) between groups. Muscle soreness of the pectoralis major was also not different between groups (p > 0.05). However, the Smith machine and barbell groups recovered from triceps brachii muscle soreness by 72 hours after exercise (p > 0.05), whereas the dumbbell group did not present any triceps brachii muscle soreness after exercise (p > 0.05). In conclusion, resistance-trained men experience similar muscle damage recovery after Smith machine, barbell, and dumbbell chest press exercise. However, muscle soreness of the elbow extensors takes a longer time to recover after a barbell chest press exercise Abstract This study investigated the effects of resistance training (RT) performed with different frequencies, including a follow-up period, on cardiorespiratory fitness in healthy older individuals. Eighty-eight men and women (69 ± 3 years, 167 ± 9 cm and 78 ± 14 kg) were randomly placed into four groups: training one- (M1 = 11, W1 = 12), two- (M2 = 7, W2 = 14), or three- (M3 = 11, W3 = 13) times per week, or a non-training control group (MCon = 11, WCon = 9).
During months 1–3, all subjects trained two times per week, while during the subsequent 6 months, training frequency was set according to the group. Oxygen consumption (cycling economy: CE), gross efficiency (GE), blood lactate concentrations (La) and heart rate (HR) were evaluated during a submaximal cycle ergometer test. Hemoglobin (Hb), hematocrit (Hct), heart rate (HRrest) and body composition by DXA were also measured at rest. Maximal strength was measured by a 1-RM leg press test. Most improvements in CE, GE, La and HR occurred in all groups during months 1–3. No additional statistically significant improvements were observed during months 4–9, although effect sizes for the change in CE and GE at higher workloads indicated a dose-response pattern in men (CE at 75 W: M1 g = 0.13, M2 g = −0.58, M3 g = −0.89; 100 W: M1 g = 0.43, M2 g = −0.59, M3 g = −0.68), i.e. higher training frequency (two- and three-times-per-week versus one-time-per-week) led to greater improvements once the typical plateau in performance had occurred. Hb increased in W1 and W2, while no changes were observed in Hct or HRrest. 1-RM increased from months 1–3 in all intervention groups (except M2), and from months 4–9 only in M3 and in all women intervention groups. During follow-up, maximal strength was maintained but cycling economy returned to baseline values in all training groups. These data indicate that RT led to significant improvements in cardiorespiratory fitness during the initial 3 months of training. This was partly explained by the RT protocol performed, but further improvements may require a higher training frequency.
These changes are likely to originate from improved cardiorespiratory function rather than neuromuscular adaptations, as evidenced by a lack of significant relationship during the intervention as well as the divergent results during follow-up Changes in muscle mass and strength will vary depending on the volume and frequency of training. The purpose of this study was to determine the effect of short-term equal-volume resistance training with different workout frequencies on lean tissue mass and muscle strength. Twenty-nine untrained volunteers (27–58 years; 23 women, 6 men) were assigned randomly to 1 of 2 groups: group 1 (n = 15; 12 women, 3 men) trained 2 times per week and performed 3 sets of 10 repetitions to fatigue for 9 exercises; group 2 (n = 14; 11 women, 3 men) trained 3 times per week and performed 2 sets of 10 repetitions to fatigue for 9 exercises. Prior to and following training, whole-body lean tissue mass (dual-energy X-ray absorptiometry) and strength (1 repetition maximum squat and bench press) were measured. Both groups increased lean tissue mass (2.2%), squat strength (28%), and bench press strength (22–30%) with training (p < 0.05), with no other differences. These results suggest that the volume of resistance training may be more important than frequency in developing muscle mass and strength in men and women initiating a resistance training program The purpose of this study was to compare different split resistance training routines on body composition and muscular strength in elite bodybuilders. Ten male bodybuilders (26.7 ± 2.7 years, 85.3 ± 10.4 kg) were randomly assigned to one of two resistance training groups: 4 or 6 times per week (G4× and G6×, respectively), in which the individuals trained for 4 weeks, performing 4 sets for each exercise at 6-12 repetition maximum (RM) in a pyramid fashion.
Body composition was assessed by dual-energy X-ray absorptiometry, and muscle strength was evaluated by 1RM bench-press testing. Food intake was planned by nutritionists and offered individually throughout the duration of the experiment. Significant increases (p < .05) in fat-free mass (G4× = +4.2%, G6× = +3.5%) and muscular strength (G4× = +8.4%, G6× = +11.4%), with no group-by-time interaction, were observed. We conclude that weekly frequencies of 4 and 6 resistance training sessions promote similar increases in fat-free mass and muscular strength in elite bodybuilders
11,064
28,975,517
Conclusion Acute exercise performance is increased with hyperoxia. An FiO2 ≥ 0.30 appears to be beneficial for performance, with a higher FiO2 being correlated with greater performance improvement in TTs, TTE, and dynamic muscle function tests. Exercise training and recovery supplemented with hyperoxic gas appear to have a beneficial effect on subsequent exercise performance, but small sample sizes and wide disparity in experimental protocols preclude definitive conclusions
Background Acute exercise performance can be limited by arterial hypoxemia, such that hyperoxia may act as an ergogenic aid by increasing tissue oxygen availability. Hyperoxia during a single bout of exercise has been examined using many test modalities, including time trials (TTs), time to exhaustion (TTE), graded exercise tests (GXTs), and dynamic muscle function tests. Hyperoxia has also been used as a long-term training stimulus or as a recovery intervention between bouts of exercise. However, due to methodological differences in the fraction of inspired oxygen (FiO2), exercise type, training regime, or recovery protocols, a firm consensus on the effectiveness of hyperoxia as an ergogenic aid for exercise training or recovery remains lacking. Objectives The aims of this study were to (1) determine the efficacy of hyperoxia as an ergogenic aid for exercise performance, training stimulus, and recovery before subsequent exercise; and (2) determine whether a dose-response relationship exists between FiO2 and exercise performance improvements.
Inspiring a hyperoxic (H) gas permits subjects to exercise at higher power outputs while training, but there is controversy as to whether this improves skeletal muscle oxidative capacity, maximal O2 consumption (VO2max), and endurance performance to a greater extent than training in normoxia (N). To determine whether the higher power output during H training leads to a greater increase in these parameters, nine recreationally active subjects were randomly assigned in a single-blind fashion to train in H (60% O2) or N for 6 wk (3 sessions/wk of 10 × 4 min at 90% VO2max). Training heart rate (HR) was maintained during the study by increasing power output. After at least 6 wk of detraining, a second 6-wk training protocol was completed with the other breathing condition. VO2max and cycle time to exhaustion at 90% of pretraining VO2max were tested in room air pre- and posttraining. Muscle biopsies were sampled pre- and posttraining for citrate synthase (CS), beta-hydroxyacyl-coenzyme A dehydrogenase (beta-HAD), and mitochondrial aspartate aminotransferase (m-AsAT) activity measurements. Training power outputs were 8% higher (17 W) in H vs. N. However, both conditions produced similar improvements in VO2max (11-12%); time to exhaustion (approximately 100%); and CS (H, 30%; N, 32%), beta-HAD (H, 23%; N, 21%), and m-AsAT (H, 21%; N, 26%) activities. We conclude that the additional training stimulus provided by training in H was not sufficient to produce greater increases in the aerobic capacity of skeletal muscle, whole body VO2max, or exercise performance compared with training in N Both time-to-exhaustion (TTE) and time-trial (TT) exercise tests are commonly used to assess exercise performance, but no study has directly examined the reliability of comparable tests in the same subjects.
PURPOSE To evaluate the reliability of comparable TTE and TT treadmill running tests of high and moderately high exercise intensity in endurance-trained male distance runners, and to validate the Hinckson and Hopkins TT prediction method using log-log modeling from TTE results. METHODS After familiarization tests, eight endurance-trained male distance runners performed, in a randomized, counterbalanced order, eight trials consisting of two 5-km TTs and two 1500-m TTs, and four TTE tests run at a speed equivalent to the average speed attained during the 5-km and 1500-m TT distances. RESULTS Typical error of the estimate (TEE) expressed as a coefficient of variation for the 5-km TT, 5-km TTE, 1500-m TT, and 1500-m TTE was 2.0, 15.1, 3.3, and 13.2%, respectively. The standard error of the estimate for predicted TT running speed using log-log modeling from TTE results was 0.67%, and the predicted versus criterion reliability of this method revealed TEE values of 1.6% and 2.5% for the prediction of 5-km and 1500-m TT, respectively. CONCLUSION The variability of the 5-km and 1500-m TT tests was significantly less than for similar TTE treadmill protocols. Despite the greater variability of the TTE tests, log-log modeling using the TTE test results reliably predicted actual TT performance The combined effects of hyperventilation and arterial desaturation on cerebral oxygenation (ScO2) were determined using near-infrared spectroscopy. Eleven competitive oarsmen were evaluated during a 6-min maximal ergometer row. The study was randomized in a double-blind fashion with an inspired O2 fraction of 0.21 or 0.30 in a crossover design.
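The log-log prediction method described above (predicting TT speed from TTE results) fits a straight line to log(speed) against log(duration) and reads off the sustainable speed for a target duration. A minimal sketch with hypothetical TTE numbers, not the study's data:

```python
import numpy as np

# Hypothetical TTE results: (running speed in km/h, time to exhaustion in s).
tte_results = [(21.0, 420.0), (19.0, 1080.0)]

log_speed = np.log([s for s, _ in tte_results])
log_time = np.log([t for _, t in tte_results])

# Linear fit in log-log space: log(speed) = intercept + slope * log(time).
slope, intercept = np.polyfit(log_time, log_speed, 1)

def predict_speed(duration_s):
    """Predicted sustainable speed (km/h) for a TT of the given duration."""
    return float(np.exp(intercept + slope * np.log(duration_s)))
```

With only two calibration points the fit is exact at those durations; in practice more TTE trials would be pooled before predicting TT performance.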
During exercise with an inspired O2 fraction of 0.21, the arterial CO2 pressure (35 ± 1 mmHg; mean ± SE) and O2 pressure (77 ± 2 mmHg) as well as the hemoglobin saturation (91.9 ± 0.7%) were reduced (P < 0.05). ScO2 was reduced from 80 ± 2 to 63 ± 2% (P < 0.05), and the near-infrared spectroscopy-determined concentration changes in deoxy- (ΔHb) and oxyhemoglobin (ΔHbO2) of the vastus lateralis muscle increased 22 ± 3 μM and decreased 14 ± 3 μM, respectively (P < 0.05). Increasing the inspired O2 fraction to 0.30 did not affect ventilation (174 ± 4 l/min), but arterial CO2 pressure (37 ± 2 mmHg), O2 pressure (165 ± 5 mmHg), and hemoglobin O2 saturation (99 ± 0.1%) increased (P < 0.05). ScO2 remained close to the resting level during exercise (79 ± 2 vs. 81 ± 2%), and although the muscle ΔHb (18 ± 2 μM) and ΔHbO2 (-12 ± 3 μM) were similar to those established without O2 supplementation, work capacity increased from 389 ± 11 to 413 ± 10 W (P < 0.05). These results indicate that an elevated inspiratory O2 fraction increases exercise performance related to maintained cerebral oxygenation rather than to an effect on the working muscles The aim of this study was to determine whether the decreased muscle and blood lactate during exercise with hyperoxia (60% inspired O2) vs. room air is due to decreased muscle glycogenolysis, leading to decreased pyruvate and lactate production and efflux. We measured pyruvate oxidation via PDH, muscle pyruvate and lactate accumulation, and lactate and pyruvate efflux to estimate total pyruvate and lactate production during exercise. We hypothesized that 60% O2 would decrease muscle glycogenolysis, resulting in decreased pyruvate and lactate contents, leading to decreased muscle pyruvate and lactate release with no change in PDH activity. Seven active male subjects cycled for 40 min at 70% VO2 peak on two occasions when breathing 21 or 60% O2.
Arterial and femoral venous blood samples and blood flow measurements were obtained throughout exercise, and muscle biopsies were taken at rest and after 10, 20, and 40 min of exercise. Hyperoxia had no effect on leg O2 delivery, O2 uptake, or RQ during exercise. Muscle glycogenolysis was reduced by 16% with hyperoxia (267 ± 19 vs. 317 ± 21 mmol/kg dry wt), translating into a significant, 15% reduction in total pyruvate production over the 40-min exercise period. Decreased pyruvate production during hyperoxia had no effect on PDH activity (pyruvate oxidation) but significantly decreased lactate accumulation (60%: 22.6 ± 6.4 vs. 21%: 31.3 ± 8.7 mmol/kg dry wt), lactate efflux, and total lactate production over 40 min of cycling. Decreased glycogenolysis in hyperoxia was related to an approximately 44% lower epinephrine concentration and an attenuated accumulation of the potent phosphorylase activators ADPf and AMPf during exercise. The greater phosphorylation potential during hyperoxia was related to a significantly diminished rate of PCr utilization. The tighter metabolic match between pyruvate production and oxidation resulted in a decrease in total lactate production and efflux over 40 min of exercise during hyperoxia Abstract Six male rowers rowed maximally for 2500 m in ergometer tests during normoxia (fractional concentration of oxygen in inspired air, FIO2 0.209), in hyperoxia (FIO2 0.622) and in hypoxia (FIO2 0.158) in a randomized single-blind fashion. Oxygen consumption (V̇O2), force production of strokes as well as integrated electromyographs (iEMG) and mean power frequency (MPF) from seven muscles were measured at 500-m intervals. The iEMG signals from individual muscles were summed to represent the overall electrical activity of these muscles (sum-iEMG).
Maximal force of a stroke (Fmax) decreased from the 100% pre-exercise maximal value to 67 (SD 12)%, 63 (SD 15)% and 76 (SD 13)% (P<0.05 vs. normoxia, ANOVA) and impulse to 78 (SD 4)%, 75 (SD 14)% and 84 (SD 7)% (P<0.05) in normoxia, hypoxia and hyperoxia, respectively. A strong correlation between Fmax and V̇O2 was found in normoxia but not in hypoxia and hyperoxia. The mean sum-iEMG tended to be lower (P<0.05) in hypoxia than in normoxia, but hyperoxia had no significant effect on it. In general, FIO2 did not affect the MPF of individual muscles. In conclusion, it was found that force output during ergometer rowing was impaired during hypoxia and improved during hyperoxia when compared with normoxia. Moreover, the changes in force output were only partly accompanied by changes in muscle electrical activity, as sum-iEMG was affected by hypoxic but not by hyperoxic gas. The lack of a significant correlation between Fmax and V̇O2 during hypoxia and hyperoxia may suggest a partial uncoupling of these processes and the existence of other limiting factors in addition to V̇O2 This investigation explored the influence of supplemental oxygen administered during the recovery periods of an interval-based running session on post-exercise markers of reactive oxygen species (ROS) and inflammation. Ten well-trained male endurance athletes completed two sessions of 10 × 3 min running intervals at 85% of the maximal oxygen consumption velocity (vVO2peak) on a motorised treadmill. A 90-s recovery period was given between each interval, during which time the participants were administered either a hyperoxic (HYP) (fraction of inspired oxygen (FIO2) 99.5%) or normoxic (NORM) (FIO2 21%) gas, in a randomized, single-blind fashion. Pulse oximetry (SpO2), heart rate (HR), blood lactate (BLa), perceived exertion (RPE), and perceived recovery (TQRper) were recorded during each trial.
Venous blood samples were taken pre-exercise, post-exercise and 1 h post-exercise to measure interleukin-6 (IL-6) and isoprostanes (F2-IsoP). The SpO2 was significantly lower than baseline following all interval repetitions in both experimental trials (p < 0.05). The SpO2 recovery time was significantly quicker in the HYP when compared to the NORM (p < 0.05), with a trend for improved perceptual recovery. The IL-6 and F2-IsoP were significantly elevated immediately post-exercise, but had significantly decreased by 1 h post-exercise in both trials (p < 0.05). There were no differences in IL-6 or F2-IsoP levels between trials. Supplemental oxygen provided during the recovery periods of interval-based exercise improves the recovery time of SpO2 but has no effect on post-exercise ROS or inflammatory responses It has been shown that peak oxygen uptake (V̇O2peak) during leg exercise is enhanced by an increased inspiratory oxygen fraction (FiO2), indicating that oxygen supply is the limiting factor. Whether oxygen supply is a limiting factor in arm exercise performance is unknown. The purpose of this study, therefore, was to examine the effect of different levels of FiO2 on V̇O2peak during arm exercise in healthy individuals. Nine men successfully performed three incremental arm-cranking exercise tests with FiO2 15%, FiO2 21% and FiO2 50% applied in counterbalanced order. A significant FiO2 dependency was observed for V̇O2peak (p=0.02) and power output (p=0.03), and post hoc tests revealed a significant difference in V̇O2peak between 15 and 50% FiO2 (p=0.02), but not between 15 and 21% FiO2, or between 21 and 50% FiO2. The results of this study show that V̇O2peak is enhanced with increasing FiO2, and suggest that V̇O2peak during arm exercise is limited by oxygen supply rather than by the metabolic machinery within the muscle itself Abstract This study investigated the relationship between aerobic efficiency during cycling exercise and the increase in physical performance with acute hyperoxic exposure (FiO2 ~31%) (HOX), and also tested the hypothesis that fat oxidation could be increased by acute hyperoxia. Fourteen males and four females were recruited for two sessions, where they exercised for 2 × 10 min at 100 W to determine efficiency. HOX and normoxia (NOX) were administered randomly on both occasions to account for differences in nitrogen exchange. Thereafter, a progressive ramp test was performed to determine VO2max and maximal power output (Wmax). After 30 min rest, the workload was set to 80% of maximal power output (Wmax) for a time-to-exhaustion test (TTE). At 100 W gross efficiency was reduced from 19.4% during NOX to 18.9% during HOX (P ≤ 0.0001). HOX increased fat oxidation at 100 W by 52%, from 3.41 kcal min-1 to 5.17 kcal min-1 (P ≤ 0.0001), with a corresponding reduction in carbohydrate oxidation. Wmax increased by 2.4%, from 388.8 (±82.1) during NOX to 397.8 (±83.5) during HOX (P ≤ 0.0001). SaO2 was higher in HOX both at the end of the maximal exercise test and the TTE. Subjects with a high level of efficiency in NOX had a larger improvement in Wmax with HOX, in agreement with the hypothesis that an optimum level of efficiency exists that maximizes power production. No association between mitochondrial excess capacity and endurance performance was found; increases in oxygen supply seemed to increase maximal aerobic power production and maintain/increase endurance capacity at the same relative workload The present study examined the effect of the oxygen fraction in inspired air (FIO2) on exercise performance and maximum oxygen consumption (VO2max).
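Gross efficiency, as reported in the cycling efficiency study above (about 19% at 100 W), is the ratio of mechanical power to metabolic power. A minimal sketch, assuming an energy equivalent of roughly 20.9 kJ per litre of O2 (a typical value near RER 0.9; the function name and default are illustrative, not from the study):

```python
def gross_efficiency(power_w, vo2_l_min, kj_per_l_o2=20.9):
    """Gross efficiency (%) = mechanical power / metabolic power * 100.

    kj_per_l_o2 is an assumed energy equivalent of oxygen; it varies
    with substrate use (roughly 19.6-21.1 kJ/L across RER 0.7-1.0).
    """
    metabolic_power_w = vo2_l_min * kj_per_l_o2 * 1000.0 / 60.0  # J/s
    return 100.0 * power_w / metabolic_power_w
```

With a hypothetical VO2 of about 1.5 L/min at 100 W, this yields roughly 19%, of the same order as the efficiencies reported above.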
Six national-level male rowers performed three 2500-m all-out tests on a Concept II rowing ergometer. Each subject performed one test in normoxia (FIO2 20.9%), one in simulated hyperoxia (FIO2 62.2%) and one in simulated hypoxia (FIO2 15.8%) in a randomized single-blind fashion. The mean final rowing time was 2.3 ± 0.9% (P < 0.01; 95% CI 1.4-3.2) shorter in hyperoxia and 5.3 ± 1.8% (P < 0.01; 95% CI 3.1-7.5) longer in hypoxia when compared with normoxia. The effect of FIO2 on VO2max exceeded its effect on exercise performance, as VO2max was 11.1 ± 5.7% greater (P < 0.01; 95% CI 5.1-17.1) in hyperoxia and 15.5 ± 3.2% smaller in hypoxia (P < 0.01; 95% CI 12.2-19.0) than in normoxia. Blood lactate concentration and O2 consumption per power unit (ml O2·W-1) failed to indicate statistically significant differences in anaerobic metabolism between normoxia and the other two conditions. These data suggest that there are other parameters besides those of energy metabolism that affect exercise performance as FIO2 is modified. These possible mechanisms are discussed in this paper This study evaluated the dynamic response of VO2 in 6 healthy men at the onset and end of submaximal step changes in work rate during a pseudorandom binary sequence (PRBS) exercise test and during ramp incremental exercise to exhaustion while breathing three different gas mixtures. The fractional concentrations of inspired O2 were 0.14, 0.21, and 0.70 for the hypoxic, normoxic, and hyperoxic tests, respectively. Both maximal VO2 and work rate were significantly reduced in hypoxic tests compared to normoxic and hyperoxic tests. Maximal work rate was greater in hyperoxia than in normoxia. Work rate at ventilatory threshold was lower in hypoxia than in normoxia and hyperoxia but above the upper limit of exercise for the submaximal tests.
Hypoxia significantly slowed the response of VO2 both at the onset and end of exercise compared to normoxia and hyperoxia. Hypoxia also modified the response to PRBS exercise, and again there was no difference between normoxia and hyperoxia. These data support the concept that VO2 kinetics can be slowed from the normoxic response by a hypoxic gas mixture The effects of hyperoxia on submaximal exercise with the self-contained breathing apparatus (SCBA) were studied in 25 males. Each participant completed a graded exercise test for the determination of ventilatory threshold (VT) and then a submaximal practice trial with a normoxic gas mixture. The normoxic (20.93 ± 0.22% O2; SUB21) and hyperoxic (40.18 ± 0.73% O2; SUB40) submaximal trials were then administered in a random order. All exercise tests were completed on separate days while wearing firefighting gear and the SCBA. Compared with SUB21, hyperoxia significantly reduced minute ventilation (V̇E), mask pressure (Pmask), heart rate, blood lactate concentration, perceived exertion, and perceived breathing distress. As expected, hemoglobin saturation remained higher (p<0.05) during SUB40. The reductions in both V̇E and Pmask with hyperoxia imply a reduction in the work of breathing during exercise. Total gas consumption was 10.3 ± 8.1% lower during SUB40 when compared to SUB21, another finding that has significant practical implications for occupational safety Abstract The effect of hyperoxic gas supplementation on the recovery time of oxygen saturation levels (SaO2), and its effect on perceptual recovery, were assessed. Seven national-level kayak athletes completed two laboratory-based ergometer sessions of 6 × 3-min maximal aerobic intervals, with 2 min recovery between repetitions. During each recovery period, athletes either inhaled a hyperoxic gas (99.5 ± 0.2% FIO2) or were given no external supplementation (control).
Mean power output, stroke rate, heart rate, and ratings of perceived exertion were collected during each interval repetition, and the intensity was matched between trials. During each 2-min recovery period, post-exercise haemoglobin saturation levels were measured via pulse oximetry (SpO2), and the time taken for the SpO2 to return to pre-exercise values was recorded. Subsequently, a rating of perceived recovery quality was collected. There were no differences in the levels of post-exercise de-saturation between the hyperoxic and control trials (P > 0.05), although the recovery time of SpO2 was significantly faster in the hyperoxic trial (P < 0.05). There was no influence of oxygen supplementation on the athletes' perception of recovery quality. Hyperoxic gas supplementation during the recovery periods between high-intensity intervals substantially improves the recovery time of SpO2 with no likely influence on recovery perception This study aimed to investigate the involvement of cerebral oxygenation in the limitation of maximal exercise. We hypothesized that O2 supplementation improves physical performance in relation to its effect on cerebral oxygenation during exercise. Eight untrained men (age 27 ± 6 years; V̇O2max 45 ± 8 ml min-1 kg-1) performed two randomized exhaustive ramp exercises on a cycle ergometer (1 W/3 s) under normoxia and hyperoxia (FIO2 = 0.3). Cerebral (ΔCOx) and muscular (ΔMOx) oxygenation responses to exercise were monitored using near-infrared spectroscopy. Power outputs corresponding to maximal exercise intensity, to the threshold of ΔCOx decline (ThCOx) and to the respiratory compensation point (RCP) were determined. Power output (Ẇmax = 302 ± 20 vs. 319 ± 28 W) and arterial O2 saturation estimated by pulse oximetry (SpO2 = 95.7 ± 0.9 vs. 97.0 ± 0.5%) at maximal exercise were increased by hyperoxia (P < 0.05).
However, the ΔMOx response during exercise was not significantly modified with hyperoxia. RCP (259 ± 17 vs. 281 ± 25 W) and ThCOx (259 ± 23 vs. 288 ± 30 W) were, however, improved (P < 0.05) with hyperoxia, and the ThCOx shift was related to the Ẇmax improvement with hyperoxia (r = 0.71, P < 0.05). The relationship between the change in cerebral oxygenation response to exercise and the performance improvement with hyperoxia supports the view that cerebral oxygenation limits exercise performance in healthy young subjects Large increases in systemic oxygen content cause substantial reductions in exercising forearm blood flow (FBF) due to increased vascular resistance. We hypothesized that 1) functional sympatholysis (blunting of sympathetic α-adrenergic vasoconstriction) would be attenuated during hyperoxic exercise and 2) α-adrenergic blockade would limit vasoconstriction during hyperoxia and increase FBF to levels observed under normoxic conditions. Nine male subjects (age 28 ± 1 yr) performed forearm exercise (20% of maximum) under normoxic and hyperoxic conditions. Studies were performed in a hyperbaric chamber at 1 atmosphere absolute (ATA; sea level) while breathing 21% O2 and at 2.82 ATA while breathing 100% O2 (estimated change in arterial O2 content ~6 ml O2/100 ml). FBF (ml/min) was measured using Doppler ultrasound. Forearm vascular conductance (FVC) was calculated from FBF and blood pressure (arterial catheter). Vasoconstrictor responsiveness was determined using intra-arterial tyramine. FBF and FVC were substantially lower during hyperoxic exercise than normoxic exercise (~20-25%; P < 0.01). At rest, vasoconstriction to tyramine (% decrease from pretyramine values) did not differ between normoxia and hyperoxia (P > 0.05). During exercise, vasoconstrictor responsiveness was slightly greater during hyperoxia than normoxia (-22 ± 3 vs. -17 ± 2%; P < 0.05).
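Vascular conductance in the forearm study above is simply flow divided by mean arterial pressure. A minimal sketch of that calculation with hypothetical flow and pressure values (the function name and numbers are illustrative, not from the study):

```python
def vascular_conductance(fbf_ml_min, map_mmhg):
    """Forearm vascular conductance (ml/min/mmHg): blood flow / mean arterial pressure."""
    return fbf_ml_min / map_mmhg

# Hypothetical values: at the same pressure, a ~23% lower flow during
# hyperoxia gives a proportionally lower conductance.
fvc_normoxia = vascular_conductance(300.0, 100.0)
fvc_hyperoxia = vascular_conductance(230.0, 100.0)
```

This mirrors the pattern reported above, where FBF and FVC were ~20-25% lower during hyperoxic exercise.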
However, during α-adrenergic blockade, hyperoxic exercise FBF and FVC remained lower than during normoxia (P < 0.01). Therefore, our data suggest that although the vasoconstrictor responsiveness during hyperoxic exercise was slightly greater, it likely does not explain the majority of the large reductions in FBF and FVC (~20-25%) during hyperbaric hyperoxic exercise Maximal O2 delivery and O2 uptake (VO2) per 100 g of active muscle mass are far greater during knee extensor (KE) than during cycle exercise: 73 and 60 ml·min-1·100 g-1 (2.4 kg of muscle) (R. S. Richardson, D. R. Knight, D. C. Poole, S. S. Kurdak, M. C. Hogan, B. Grassi, and P. D. Wagner. Am. J. Physiol. 268 (Heart Circ. Physiol. 37): H1453-H1461, 1995) and 28 and 25 ml·min-1·100 g-1 (7.5 kg of muscle) (D. R. Knight, W. Schaffartzik, H. J. Guy, R. Predilleto, M. C. Hogan, and P. D. Wagner. J. Appl. Physiol. 75: 2586-2593, 1993), respectively. Although this is evidence of muscle O2 supply dependence in itself, it raises the following question: with such high O2 delivery in KE, are the quadriceps still O2 supply dependent at maximal exercise? To answer this question, seven trained subjects performed maximum KE exercise in hypoxia [0.12 inspired O2 fraction (FIO2)], normoxia (0.21 FIO2), and hyperoxia (1.0 FIO2) in a balanced order. The protocol (after warm-up) was a square wave to a previously determined maximum work rate followed by incremental stages to ensure that a true maximum was achieved under each condition. Direct measures of arterial and venous blood O2 concentration in combination with a thermodilution blood flow technique allowed the determination of O2 delivery and muscle VO2. Maximal O2 delivery increased with inspired O2: 1.3 ± 0.1, 1.6 ± 0.2, and 1.9 ± 0.2 l/min at 0.12, 0.21, and 1.0 FIO2, respectively (P < 0.05).
Maximal work rate was affected by variations in inspired O2 (-25 and +14% at 0.12 and 1.0 FIO2, respectively, compared with normoxia, P < 0.05), as was maximal VO2 (VO2max): 1.04 ± 0.13, 1.24 ± 0.16, and 1.45 ± 0.19 l/min at 0.12, 0.21, and 1.0 FIO2, respectively (P < 0.05). Calculated mean capillary PO2 also varied with FIO2 (28.3 ± 1.0, 34.8 ± 2.0, and 40.7 ± 1.9 Torr at 0.12, 0.21, and 1.0 FIO2, respectively, P < 0.05) and was proportionally related to changes in VO2max, supporting our previous finding that a decrease in O2 supply will proportionately decrease muscle VO2max. As even in the isolated quadriceps (where normoxic O2 delivery is the highest recorded in humans) an increase in O2 supply by hyperoxia allows the achievement of a greater VO2max, we conclude that, in normoxic conditions of isolated KE exercise, KE VO2max in trained subjects is not limited by mitochondrial metabolic rate but, rather, by O2 supply PURPOSE To determine whether training in a hyperoxic environment would result in greater increases in VO2max and performance at 90% VO2max as compared with training in normoxia. METHODS In a single-blind design nine athletes trained for 6 wk on a cycle ergometer 3 d·wk-1, 1 h·d-1 (10 × 4-min intervals, with 2 min of rest between intervals) at 90% HRmax. The training HR range was maintained by adjusting the power output. Subjects were randomly assigned to H (60% O2) or N (21% O2) breathing conditions for training. After 12 wk of detraining, a second 6-wk training protocol was completed with the breathing conditions reversed. VO2max, performance time at 90% VO2max and cardiorespiratory response to steady-state exercise at 80% VO2max were measured pre- and posttraining. All pre- and posttraining tests were conducted under normoxic conditions. RESULTS There were no significant differences between pretraining results for any of the parameters.
Power output was 8.1% higher while training in H compared with N, to maintain the training HR. Both H and N training resulted in increased performance time, with H being greater than N. Although there was a trend for a greater increase in VO2max after H versus N training, this difference was not significant. HRmax did not change for H or N. HR and VE at 80% VO2max decreased posttraining with no differences between H and N. CONCLUSION The data showed that a higher power output was required to maintain HR during H training. This increased training intensity during H resulted in improved exercise performance while cycling at 90% VO2max in room air and may be due to peripheral factors, because cardiorespiratory responses were similar The purpose of this study was to examine the influence of hyperoxic gas (50% O2 in N2) inspiration on pulmonary oxygen uptake (V̇O2) kinetics during step transitions to moderate, severe and supra-maximal intensity cycle exercise. Seven healthy male subjects completed repeat transitions to moderate (90% of the gas exchange threshold, GET), severe (70% of the difference between the GET and V̇O2peak) and supra-maximal (105% V̇O2peak) intensity work rates while breathing either normoxic (N) or hyperoxic (H) gas before and during exercise. Hyperoxia had no significant effect on the Phase II V̇O2 time constant during moderate (N: 28±3 s versus H: 31±7 s), severe (N: 32±9 s versus H: 33±6 s) or supra-maximal (N: 37±9 s versus H: 37±9 s) exercise. Hyperoxia resulted in a 45% reduction in the amplitude of the V̇O2 slow component during severe exercise (N: 0.60±0.21 L min-1 versus H: 0.33±0.17 L min-1; P < 0.05) and a 15% extension of time to exhaustion during supra-maximal exercise (N: 173±28 s versus H: 198±41 s; P < 0.05).
These results indicate that the Phase II V̇O2 kinetics are not normally constrained by (diffusional) O2 transport limitations during moderate, severe or supra-maximal intensity exercise in young healthy subjects performing upright cycle exercise Maximum oxygen uptake (VO2max) was measured in six college-aged males under normoxic (NVO2max) and hyperoxic (HVO2max; 70% oxygen) conditions. Subjects then randomly performed the following three 20-min submaximal exercise bouts: 75% normoxic VO2max under normoxia (NVO2N), 75% normoxic VO2max under hyperoxia (NVO2H), and 75% hyperoxic VO2max under hyperoxia (HVO2H). Metabolic parameters were obtained at 5-min intervals. Hyperoxia resulted in a 13% increase (P less than 0.01) in VO2max (NVO2max = 3.54 l·min-1 vs HVO2max = 4.00 l·min-1). Significant (P less than 0.05) decreases were observed in ventilation (VE) (13%), epinephrine (37%), norepinephrine (26%), and blood lactate (28%), with no change in oxygen uptake (VO2), carbon dioxide production (VCO2), or respiratory exchange ratio (R) during hyperoxia at the same absolute power output (NVO2N vs NVO2H). However, at the same relative power outputs (NVO2N vs HVO2H) no significant changes in VE, epinephrine, norepinephrine, or blood lactate were observed when hyperoxia and normoxia were compared Background: The impact of hyperoxia on exercise limitation is still incompletely understood. Objectives: We investigated to what extent breathing hyperoxia enhances the exercise performance of healthy subjects and which physiologic mechanisms are involved. Methods: A total of 32 healthy volunteers (43 ± 15 years, 12 women) performed 4 bicycle exercise tests to exhaustion with ramp and constant-load protocols (at 75% of the maximal workload [Wmax] on FiO2 0.21) on separate occasions while breathing ambient (FiO2 0.21) or oxygen-enriched air (FiO2 0.50) in a random, blinded order.
Workload, endurance, gas exchange, pulse oximetry (SpO2), and cerebral (CTO) and quadriceps muscle tissue oxygenation (QMTO) were measured. Results: During the final 15 s of ramp exercising with FiO2 0.50, Wmax (mean ± SD 270 ± 80 W), SpO2 (99 ± 1%), and CTO (67 ± 9%) were higher and the Borg CR10 Scale dyspnea score was lower (4.8 ± 2.2) than the corresponding values with FiO2 0.21 (Wmax 257 ± 76 W, SpO2 96 ± 3%, CTO 61 ± 9%, and Borg CR10 Scale dyspnea score 5.7 ± 2.6; p < 0.05, all comparisons). In constant-load exercising with FiO2 0.50, endurance was longer than with FiO2 0.21 (16 min 22 s ± 7 min 39 s vs. 10 min 47 s ± 5 min 58 s). With FiO2 0.50, SpO2 (99 ± 0%) and QMTO (69 ± 8%) were higher than the corresponding isotime values to end-exercise with FiO2 0.21 (SpO2 96 ± 4%, QMTO 66 ± 9%), while minute ventilation was lower in hyperoxia (82 ± 18 vs. 93 ± 23 L/min; p < 0.05, all comparisons). Conclusion: In healthy subjects, hyperoxia increased maximal power output and endurance. It improved arterial, cerebral, and muscle tissue oxygenation, while minute ventilation and dyspnea perception were reduced. The findings suggest that hyperoxia enhanced cycling performance through more efficient pulmonary gas exchange and a greater availability of oxygen to muscles and the brain (cerebral motor and sensory neurons) This study evaluated the effects of hyperoxia (inspired oxygen fraction = 40%) on performance during a simulated firefighting work circuit (SFWC) consisting of five events. On separate days, 17 subjects completed at least three orientation trials followed by two experimental trials while breathing either normoxic (NOX) or hyperoxic (HOX) gas mixtures that were randomly assigned in a double-blind, cross-over design. Previously, ventilatory threshold (Tvent) and VO2max had been determined during graded exercise (GXT) on a cycle ergometer.
Lactate concentration in venous blood was assessed at exactly 5 min after both the experimental trials and after the GXT. Total time to complete the SFWC was decreased by 4% (p < 0.05) with HOX. No differences were observed in individual event times early in the circuit; however, HOX resulted in a 12% improvement (p < 0.05) on the final event. A significantly decreased rating of perceived exertion (RPE) was also recorded immediately prior to the final event. No differences were observed in mean heart rate or post-exercise blood lactate when comparing NOX to HOX. Heart rates during the SFWC (both conditions) were higher than HR at Tvent, but lower than HR at VO2max (p < 0.05). Post-SFWC lactate values were higher (p < 0.05) than post-VO2max. These results demonstrate that hyperoxia provided a small but significant increase in performance during short duration, high intensity simulated firefighting work. This study focuses on the effect of hyperoxia on maximal oxygen uptake (VO2max) and maximal power (Pmax) in subjects exhibiting exercise-induced arterial hypoxemia (EIH) at sea level. Sixteen competing male cyclists (VO2max > 60 ml·min−1·kg−1) performed exhaustive ramp exercise (cycle-ergometer) under normoxia and moderate hyperoxia (FIO2 = 30%). After the normoxic trial, the subjects were divided into those demonstrating EIH during exercise [arterial O2 desaturation (ΔSaO2) > 5%; n = 9] and those who did not (n = 7). Under hyperoxia, SaO2 rose and the increase was greater for the EIH than for the non-EIH group (P < 0.001). VO2max improved for both groups, and to a greater extent for EIH (12.8 ± 5.7% vs.
4.2 ± 4.6%, P < 0.01; mean ± SD), and the increase was correlated to the gain in SaO2 for all subjects (r = 0.71, P < 0.01). Pmax improved by 3.3 ± 3.3% (P < 0.01) regardless of the group. These data suggest that pulmonary gas exchange contributes to a limitation in VO2max and power, especially for EIH subjects. We examined the influence of hyperoxia on peak oxygen uptake (VO2peak) and peripheral gas exchange during exercise with the quadriceps femoris muscle. Young, trained men (n = 5) and women (n = 3) performed single-leg knee-extension exercise at 70% and 100% of maximum while inspiring normal air (NOX) or 60% O2 (HiOX). Blood was sampled from the femoral vein of the exercising limb and from the contralateral artery. In comparison with NOX, hyperoxic arterial O2 tension (PaO2) increased from 13.5 ± 0.3 (mean ± SE) to 41.6 ± 0.3 kPa, O2 saturation (SaO2) from 98 ± 0.1 to 100 ± 0.1%, and O2 concentration (CaO2) from 177 ± 4 to 186 ± 4 mL L-1 (all P < 0.01). Peak exercise femoral venous PO2 (PvO2) was also higher in HiOX (3.68 ± 0.06 vs. 3.39 ± 0.7 kPa; P < 0.05), indicating a higher O2 diffusion driving pressure. HiOX femoral venous O2 saturation averaged 36.8 ± 2.0% as opposed to 33.4 ± 1.5% in NOX (P < 0.05), and O2 concentration 63 ± 6 vs. 55 ± 4 mL L-1 (P < 0.05). Peak exercise quadriceps blood flow (Qleg), measured by the thermo-dilution technique, was lower in HiOX than in NOX, 6.4 ± 0.5 vs. 7.3 ± 0.9 L min-1 (P < 0.05); mean arterial blood pressure at inguinal height was similar in NOX and HiOX at 144 and 142 mmHg, respectively. O2 delivery to the limb (Qleg times CaO2) was not significantly different in HiOX and NOX. VO2peak of the exercising limb averaged 890 mL min-1 in NOX and 801 mL min-1 in HiOX (n.s.
) corresponding to 365 and 330 mL min-1 per kg active muscle, respectively. The VO2peak-to-PvO2 ratio was lower (P < 0.05) in HiOX than in NOX, suggesting a lower O2 conductance. We conclude that the similar VO2peak values despite higher O2 driving pressure in HiOX indicate a peripheral limitation for VO2peak. This may relate to saturation of the rate of O2 turnover in the mitochondria during exercise with a small muscle group but can also be caused by tissue diffusion limitation related to lower O2 conductance. Supplemental oxygen is currently widely utilized in conjunction with athletic competition. To assess the utility of this practice, 12 professional soccer players performed two bouts of exhaustive exercise separated by 5 minutes of rest ("recovery period"). During the recovery period, the subjects breathed either room air or 100% oxygen, assigned by randomized, double-blind design. The entire procedure was repeated on each subject using the opposite gas. The administration of enriched oxygen during the recovery period had no effect on plasma lactate levels or on performance during the second period of exercise. The subjects were unable to identify which gas they received. In conclusion, data from this study indicate that using 100% oxygen applied for short periods offers no advantage on recovery from exhaustive exercise or on subsequent exercise performance. We investigated regional changes in cerebral artery velocity during incremental exercise while breathing normoxia (21% O2), hyperoxia (100% O2) or hypoxia (16% O2) [n = 10; randomized cross-over design]. Middle cerebral and posterior cerebral arterial velocities (MCAv and PCAv) were measured continuously using transcranial Doppler ultrasound. At rest, only PCAv was reduced (-7%; P = 0.016) with hyperoxia.
During low-intensity exercise (40% workload maximum [Wmax]) MCAv (+17 cm s(-1); +14 cm s(-1)) and PCAv (+9 cm s(-1); +14 cm s(-1)) were increased above baseline with normoxia and hypoxia, respectively (P < 0.05). The absolute increase from rest in MCAv was greater than the increase in PCAv between 40 and 80% Wmax with normoxia; this greater increase in MCAv was also evident at 60% Wmax with hypoxia and hyperoxia. Hyperoxic exercise resulted in larger absolute (+19 cm s(-1)) and relative (+40%) increases in PCAv compared with normoxia. Our findings highlight the selective changes in PCAv during hyperoxic incremental exercise. The normal rate of blood lactate accumulation during exercise is increased by hypoxia and decreased by hyperoxia. It is not known whether these changes are primarily determined by the lactate release in locomotory muscles or other tissues. Eleven men performed cycle exercise at 20, 35, 50, 92, and 100% of maximal power output while breathing 12, 21, and 100% O2. Leg lactate release was calculated at each stage of exercise as the product of femoral venous blood flow (thermodilution method) and the femoral arteriovenous difference in blood lactate concentrations. Regression analysis showed that leg lactate release accounted for 90% of the variability in mean arterial lactate concentration at 20-92% maximal power output. This relationship was described by a regression line with a slope of 0.28 ± 0.02 min/l and a y-intercept of 1.06 ± 0.38 mmol/l (r2 = 0.90). There was no effect of inspired O2 concentration on this relationship (P > 0.05). We conclude that during continuous incremental exercise to fatigue the effect of inspired O2 concentration on blood lactate accumulation is principally determined by the rate of net lactate release in blood vessels of the locomotory muscles.
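The regression reported in the last abstract relates arterial lactate to net leg lactate release (slope 0.28 min/l, intercept 1.06 mmol/l), with leg release computed as flow times the arteriovenous difference. A minimal sketch of that arithmetic follows; it is illustrative only, not code from the study, and the flow and a-v difference inputs are made-up example values.

```python
def leg_lactate_release(flow_l_min, av_diff_mmol_l):
    """Net leg lactate release (mmol/min): femoral venous blood flow
    times the venous-arterial lactate concentration difference."""
    return flow_l_min * av_diff_mmol_l

def predicted_arterial_lactate(release_mmol_min, slope=0.28, intercept=1.06):
    """Arterial lactate (mmol/l) predicted by the regression reported
    in the abstract (r^2 = 0.90, independent of inspired O2)."""
    return slope * release_mmol_min + intercept

# Hypothetical inputs: 6.0 l/min leg blood flow, 1.5 mmol/l a-v difference.
release = leg_lactate_release(6.0, 1.5)          # 9.0 mmol/min
print(round(predicted_arterial_lactate(release), 2))  # 3.58
```

The absence of an inspired-O2 term mirrors the study's finding that FiO2 did not shift the relationship.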
11,065
29,016,967
Discussion Automated algorithms can facilitate population-level surveillance. However, there is no true gold standard for determining nonmedical opioid use. Few algorithms have been applied in real-world settings. Conclusion Automated algorithms may facilitate identification of patients and/or providers most likely to need more intensive screening and/or intervention for nonmedical opioid use.
Objective Improved methods to identify nonmedical opioid use can help direct health care resources to individuals who need them. Automated algorithms that use large databases of electronic health care claims or records for surveillance are a potential means to achieve this goal. In this systematic review, we reviewed the utility, attempts at validation, and application of such algorithms to detect nonmedical opioid use.
OBJECTIVE To provide clinicians with a brief screening tool to predict accurately which individuals may develop aberrant behaviors when prescribed opioids for chronic pain. DESIGN One hundred and eighty-five consecutive new patients treated in one pain clinic took the self-administered Opioid Risk Tool (ORT). The ORT measured the following risk factors associated in the scientific literature with substance abuse: personal and family history of substance abuse; age; history of preadolescent sexual abuse; and certain psychological diseases. Patients received scores of 0-3 (low risk), 4-7 (moderate risk), or >= 8 (high risk), indicating the probability of their displaying opioid-related aberrant behaviors. All patients were monitored for aberrant behaviors for 12 months after their initial visits. RESULTS For those patients with a risk category of low, 17 out of 18 (94.4%) did not display an aberrant behavior. For those patients with a risk category of high, 40 out of 44 (90.9%) did display an aberrant behavior. The authors used the c statistic to validate the ORT, because it simultaneously assesses sensitivity and specificity. The ORT displayed excellent discrimination for both the male (c = 0.82) and the female (c = 0.85) prognostic models. CONCLUSION In a preliminary study, among patients prescribed opioids for chronic pain, the ORT exhibited a high degree of sensitivity and specificity for determining which individuals are at risk for opioid-related aberrant behaviors. Further studies in a variety of pain and nonpain settings are needed to determine the ORT's universal applicability. Assessing for the presence of addiction in the chronic pain patient receiving chronic opioid analgesia is a challenging clinical task. This paper presents a recently developed screening tool for addictive disease in chronic pain patients, and pilot efficacy data describing its ability to do so.
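The ORT cut-points quoted above (0-3 low, 4-7 moderate, >= 8 high) amount to a simple banding rule over the total score. A minimal sketch of that banding, illustrative only and not software from the study:

```python
def ort_risk_category(score):
    """Map an Opioid Risk Tool total score to the risk bands reported
    in the abstract: 0-3 low, 4-7 moderate, >= 8 high."""
    if score < 0:
        raise ValueError("ORT score cannot be negative")
    if score <= 3:
        return "low"
    if score <= 7:
        return "moderate"
    return "high"

print(ort_risk_category(2))  # low
print(ort_risk_category(8))  # high
```

Note that the banding only stratifies risk; the abstract's accuracy figures (94.4% of low-risk patients without aberrant behavior, 90.9% of high-risk patients with it) come from the 12-month follow-up, not from the score itself.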
In a small sample of patients (n = 52) referred from a multidisciplinary pain center for "problematic" medication use, responses to the screening questionnaire were compared between patients who met combined diagnostic criteria for a substance use disorder and those who did not, as assessed by a trained addiction medicine specialist. Responses of addicted patients significantly differed from those of nonaddicted patients on multiple screening items, with the two groups easily differentiated by total questionnaire score. Further, three key screening indicators were identified as excellent predictors for the presence of addictive disease in this sample of chronic pain patients. The Prescription Drug Use Questionnaire (PDUQ) is one of several published tools developed to help clinicians better identify the presence of opioid abuse or dependence in patients with chronic pain. This paper introduces a patient version of the PDUQ (PDUQp), a 31-item questionnaire derived from the items of the original tool designed for self-administration, and describes evidence for its validity and reliability in a sample of patients with chronic nonmalignant pain and on opioid therapy. Further, this study examines instances of discontinuation from opioid medication treatment related to violation of the medication agreement in this population, and the relationship of these with problematic opioid misuse behaviors and PDUQ and PDUQp scores. A sample of 135 consecutive patients with chronic nonmalignant pain was recruited from a multidisciplinary Veterans Affairs chronic pain clinic, and prospectively followed over one year of opioid therapy. Using the PDUQ as a criterion measure, moderate to good concurrent and predictive validity data for the PDUQp are presented, as well as an item-by-item comparison of the two formats. Reliability data indicate moderate test stability over time.
Of those patients whose opioid treatment was discontinued due to medication agreement violation-related discontinuation (MAVRD) (n = 38, or 28% of the sample), 40% (n = 11) were due to specific problematic opioid misuse behaviors. Based upon specificity and sensitivity analyses, a suggested cutoff PDUQp score for predicting MAVRD is provided. This study supports the PDUQp as a useful tool for assessing and predicting problematic opioid medication use in a chronic pain patient sample. BACKGROUND: We recently described a method to identify drug diversion in the operating room (OR) from automated drug dispensing carts by anesthesia care providers, based on a retrospective outlier analysis of atypical transactions. Such transactions included those occurring on patients after their exit from the OR and on patients whose drugs were not dispensed at the location where the case was performed. In this report, we demonstrate prospectively the utility of our methodology to detect diversion by unsuspected individuals. METHODS: Each month, all transactions involving scheduled drugs by anesthesia care providers are downloaded from the pharmacy database and matched to case records from the anesthesia information management system. The frequency of atypical transactions is determined for each provider, normalized by the number of days they worked in the OR. For individuals who are > 2 SDs above the mean for the month for any of the screening queries, a manual examination of their drug transaction logs is performed. Anesthesia records for such providers are examined manually to help determine the likelihood that diversion is taking place, and evidence of escalating activity is considered. Actions taken depend on an assessment of the strength of the evidence that diversion has been occurring. RESULTS: Two unsuspected individuals were identified prospectively as diverting drugs.
Two individuals identified as abusing drugs recreationally outside the workplace showed no evidence of drug diversion through examination of the screening reports and transaction logs, and their rehabilitation treatment teams concurred that there was a very low probability of diversion. A final individual who demonstrated suspicious activity by the screening process was determined to have been careless in documentation practices, rather than diverting. CONCLUSIONS: The drug diversion screening methodology previously developed is valid for the prospective detection of unsuspected individuals diverting drugs from the OR. The system also provides material useful in the evaluation of possible diversion by anesthesia providers determined to be abusing drugs outside the workplace.
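The screening rule in the diversion study (per-provider atypical-transaction counts, normalized by OR days worked, flagged when more than 2 SDs above the monthly mean) can be sketched as follows. This is an illustrative reimplementation under assumed data shapes, not the authors' pharmacy-database code, and the provider names and rates are made up.

```python
from statistics import mean, stdev

def flag_providers(rates, threshold_sd=2.0):
    """Return providers whose normalized atypical-transaction rate
    (transactions per OR day worked) exceeds the monthly mean by more
    than `threshold_sd` sample standard deviations, mirroring the
    screening rule in the abstract. `rates` maps provider -> rate."""
    values = list(rates.values())
    if len(values) < 2:
        return []  # not enough providers to estimate a spread
    mu, sd = mean(values), stdev(values)
    cutoff = mu + threshold_sd * sd
    return [provider for provider, rate in rates.items()
            if sd > 0 and rate > cutoff]

# Hypothetical month: ten providers near 0.1/day, one clear outlier.
month = {"p1": 0.08, "p2": 0.09, "p3": 0.10, "p4": 0.11, "p5": 0.12,
         "p6": 0.10, "p7": 0.09, "p8": 0.11, "p9": 0.10, "p10": 0.10,
         "p11": 0.95}
print(flag_providers(month))  # ['p11']
```

As in the study, a flag here is only a trigger for manual review of the transaction logs, not a determination of diversion.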
11,066
21,800,225
However, less postoperative dysphagia was observed after partial fundoplication, and laparoscopic ARS was associated with less pain medication and a shorter hospital stay. Complications of ARS varied from minimal postoperative complications to severe dysphagia and gas bloating. Conclusion ARS in children shows a good overall success rate (median 86%) in terms of complete relief of symptoms. Efficacy of ARS in neurologically impaired children may be similar to that in normally developed children. The outcome of ARS does not seem to be influenced by different surgical techniques, although postoperative dysphagia may occur less often after partial fundoplication.
Background Antireflux surgery (ARS) for gastroesophageal reflux disease (GERD) is one of the most frequently performed major operations in children. Many studies have described the results of ARS in children, however, with a wide difference in outcome. This study aims to systematically review the efficacy of pediatric ARS and its effects on gastroesophageal function, as measured by gastroesophageal function tests. This is the first systematic review comprising only prospective, longitudinal studies, minimizing the risk of bias.
BACKGROUND/PURPOSE It is reported that the main mechanism responsible for gastroesophageal reflux (GER) in children is transient lower esophageal sphincter (LES) relaxation. However, the effect of Nissen fundoplication on transient LES relaxation has not been investigated in children. This study examined the effect of Nissen fundoplication on motor patterns of the LES in children with pathological GER. METHODS Esophageal manometry and pH were recorded concurrently for 2 hours after administration of apple juice (10 mL/kg). In seven children documented to have pathological GER by prolonged esophageal pH monitoring (% of time with pH less than 4.0 exceeding 5.0%), studies were performed preoperatively and 1 to 3 months after surgery. RESULTS Nissen fundoplication virtually eliminated reflux in all patients. The percentage of time pH was less than 4.0 was reduced from 15 ± 9 to 0 ± 0. Basal LES pressure did not change significantly (pre, 21 ± 10 mm Hg vs post, 27 ± 9 mm Hg). The number of transient LES relaxations was reduced significantly from 13 ± 4 to 7 ± 7, and the mean nadir LES pressures during swallow-induced LES relaxation and transient LES relaxation increased significantly from 1 ± 1 mm Hg to 13 ± 5 mm Hg and from 0 ± 0 mm Hg to 11 ± 7 mm Hg, respectively. CONCLUSIONS Our findings suggest the antireflux effects of Nissen fundoplication may be based on changes of LES motor patterns that result in incomplete LES relaxation and a reduction in the number of transient LES relaxations. OBJECTIVES To study the effect of Nissen fundoplication and gastrostomy in severely neurologically impaired children. DESIGN Prospective observational study. SETTING Developmental Disabilities Unit of a regional medical centre in Hong Kong. PATIENTS Children with severe neurological impairment and gastroesophageal reflux who were institutionalised between 1999 and 2004 inclusive.
MAIN OUTCOME MEASURES Incidence of vomiting, gastro-intestinal bleeding, and pneumonia in the baseline year and consecutive years following surgery; 24-hour oesophageal pH monitoring; recurrence rate (determined by 24-hour oesophageal monitoring); body weight; complications of surgery; and mortality. RESULTS Twenty children, with a mean age at surgery of 8.5 (standard deviation, 3.5) years, were recruited. Nissen fundoplication was performed in nine children and 11 children underwent laparoscopic fundoplication. Children were monitored for 1.3 to 5.7 years (median, 3.5 years) after surgery. The incidence of vomiting and gastro-intestinal bleeding was significantly decreased following surgery (P < 0.001 and P = 0.001, respectively; Friedman's test). There was no difference between the preoperative and postoperative incidence of pneumonia (P = 0.973, Friedman's test). The median reflux index was reduced from 5.7% to 0.15% after surgery, but six (30%) patients had recurrent gastroesophageal reflux. The mean body weight was 17.4 kg (standard deviation, 4.7 kg) at baseline and 22.8 kg (standard deviation, 4.4 kg) at the end of follow-up (P < 0.05, Student's t test). One patient had mild dumping syndrome soon after fundoplication. One patient had one episode of intestinal obstruction. Four patients died 1.9 to 5.0 years following surgery due to respiratory disease. CONCLUSION Our results indicate that in severely neurologically impaired children with gastroesophageal reflux, vomiting, gastro-intestinal bleeding, and reflux indices based on 24-hour oesophageal pH monitoring were significantly reduced following fundoplication and gastrostomy. The incidence of pneumonia was unchanged. The recurrence rate of reflux was 30% and the mortality rate was 20%. Background: Skepticism is still present today about the laparoscopic treatment of gastro-esophageal reflux (GER) in children.
We present the prospective experience and short-term results of eight Italian pediatric surgical units. Methods: We included all the children with complicated GER operated on after January 1998 by single surgeons from eight different centers. Diagnostic aspects, type of fundoplication, and complications were considered. All the patients were followed for a minimum period of 6 months in order to detect complications or recurrences. Results: 288 children were prospectively included. Mean age was 4.8 years (3 m–14 y). Nissen fundoplication was done in 25%, floppy Nissen in 63%, Toupet in 1.7%, and anterior procedures (Lortat-Jacob, Thal) in 10%. Gastrostomy was associated if neurological impairment or feeding disorders were present. Mean follow-up was 15 months and reoperation was necessary in 3.8% of cases. Conclusions: This experience underlines that minimally invasive surgery in children is safe and that the laparoscopic approach is considered in these eight centers the gold standard for surgical repair of gastro-esophageal reflux disease, maintaining the same indications and techniques as the open approach. Laparoscopic antireflux surgery has been performed in neurologically impaired and scoliotic children. We aimed to assess the effectiveness of laparoscopic fundoplication in mentally normal children with gastroesophageal reflux disease that failed to respond to medical therapy. Data were prospectively collected (symptoms, medical therapy, endoscopic findings) on 12 children (nine boys, three girls) aged 9-15 years with gastroesophageal reflux disease. Pre- and postoperative ambulatory 24-h pH and DeMeester and Johnson scores were also recorded. Effectiveness of surgery was assessed by comparison of pre- and postoperative total acid exposure time, Visick grade, need for antireflux medication, and symptom scores.
In total, 11 children underwent a laparoscopic Nissen fundoplication and one underwent a Toupet procedure. Median length of stay was 2 (2-3) nights. The median preoperative pH acid exposure time (AET) was 4.7 (0.8-16.4) percent compared with a postoperative AET of 0.4 (0-3) percent. Early postoperative dysphagia occurred in four out of 12 patients, requiring a total of six dilatations. Postoperative Visick scores were: grade I = 7 and grade II = 5. Laparoscopic fundoplication can be safely performed and is effective in children with GERD who have failed to respond to medical therapy. Aim: We investigated the impact of laparoscopic anterior hemifundoplication on gastric emptying (GE) and specific symptoms in children with and children without neurodevelopmental delays and gastroesophageal reflux. Scintigraphic and ultrasonographic GE measurements were correlated. Patients and Methods: Twenty-six children (mean age 7 ± 6.1 years), of whom 14 were neurodevelopmentally delayed, were evaluated prospectively before and at 3 and 6 months after laparoscopic anterior hemifundoplication. All of the patients underwent clinical assessments, interviews, and 24-hour pH monitoring. Key symptoms were evaluated using a 5-point Likert scale. Gastric emptying was assessed by Tc-99m-DTPA scintigraphy and ultrasonography. Results: All of the children had significant catch-up growth after fundoplication, which was more pronounced in the neurologically normal children (P < 0.05 vs impaired), in line with a decrease in the use of omeprazole (mean 0.93 ± 0.7 mg·kg−1·day−1 before and 0.06 ± 0.18 mg·kg−1·day−1 at 6 months after the operation; P < 0.001). The 24-hour pH monitoring normalized in all of the children, and the mean severity of the key symptoms such as vomiting, choking, and pain was significantly reduced (P < 0.001).
Scintigraphic GE parameters, such as the elimination rate/minute, gastric half-emptying time (t1/2), gastric residual activity (RA), and duration of the initial merging time, were not altered significantly by the operation (P > 0.05). Ultrasonographic evaluations confirmed these results [positive correlation with scintigraphy for t1/2 (P = 0.006) and RA (P = 0.01)]. The symptom evolution and GE were uncorrelated (P > 0.01). There were no significant differences between children with and children without neurodevelopmental delays. Conclusions: Laparoscopic anterior hemifundoplication achieves an excellent symptomatic outcome without affecting GE in children with and children without neurodevelopmental delays. Objectives: Bloating, abdominal pain, and early satiety have been reported in up to 30% of patients after Nissen fundoplication. We hypothesized that these postsurgical complications in children and young adults are linked either to the effects of surgery on gastric sensation, compliance or motor function, or to preexisting physiological abnormalities. Methods: We prospectively evaluated the effect of Nissen fundoplication on gastric sensory and motor functions in 13 children with gastroesophageal reflux. Gastric barostat and mixed-meal gastric emptying studies were performed before surgery in all patients and were repeated after surgery in 8 and 9 children, respectively. Results: Thirteen patients (median age, 7 years; range, 6 months to 18 years) underwent open Nissen (n = 6) or laparoscopic Nissen fundoplication (n = 7). After fundoplication, patients had significantly higher minimal distending pressure values (10 mm Hg vs 3 mm Hg pre-Nissen, respectively; P < 0.001), reduced gastric compliance (slope values of 8.39 mm Hg vs 9.15 mm Hg, respectively, P < 0.001) and significantly higher pain scores (P < 0.001).
Presurgery and postsurgery gastric emptying at 60, 90 and 120 minutes after feeding showed no significant changes. Conclusions: After Nissen fundoplication, children with gastroesophageal reflux manifest the following: (1) reduction in gastric compliance, (2) increase in minimal gastric distending pressure, (3) exacerbation of the sensation of discomfort with gastric distension and (4) no effect on gastric emptying. Background The laparoscopic approach has become increasingly popular for fundoplication over the last few years; however, many surgeons are skeptical about its real advantages. Methods We conducted a prospective comparative study of children operated on for gastroesophageal reflux (GER). Exclusion criteria included age < 1 year and > 14 years, previous surgery on the esophagus or stomach, and neurologic impairment. We compared two groups of patients who met the same inclusion/exclusion criteria. One group was treated via a laparotomic approach between January 1993 and December 1997; the other was treated via a laparoscopic approach between September 1998 and December 2000. A 360° wrap was performed in each group. Results Group 1 (laparotomic approach) included 17 patients; mean operative time was 100 min and postoperative time was 7 days. Group 2 comprised 49 children operated on via a laparoscopic approach; mean operative time was 78 min and postoperative time was 48 hours. No major complications were encountered in either group. In the postoperative period, two patients in group 1 had complications. One had a prolonged bout of gastroplegia, which required nasogastric drainage, and then recovered spontaneously after 20 days; the other had stenosis of the wrap, which required dilation. No relapses occurred during a follow-up of 6 months. Long-term follow-up data are not presented. Comparative analysis of the short-term functional results indicated that there were no differences between the two groups.
Conclusion This study confirms that the minimally invasive approach is safe and effective for the treatment of primary gastroesophageal reflux disease in children. CONTEXT AND OBJECTIVE Association between neurological lesions and gastroesophageal reflux disease (GERD) in children is very common. When surgical treatment is indicated, the consensus favors the fundoplication technique recommended by Nissen, despite its high morbidity and relapse rates. Vertical gastric plication is a procedure that may have advantages over Nissen fundoplication, since it is less aggressive and more adequately meets anatomical principles. The authors proposed to compare the results from the Nissen and vertical gastric plication techniques. DESIGN AND SETTING Randomized prospective study within the Postgraduate Surgery and Experimentation Program of UNIFESP-EPM, at Hospital do Servidor Público Estadual (IAMSPE) and Hospital Municipal Infantil Menino Jesus. METHODS Fourteen consecutive children with cerebral palsy attended between November 2003 and July 2004 were randomized into two groups for surgical treatment of GERD: NF, Nissen fundoplication (n = 7); and VGP, vertical gastric plication (n = 7). These were clinically assessed by scoring for signs and symptoms, evaluation of esophageal pH measurements, duration of the operation, intra- and postoperative complications, mortality and length of hospital stay. RESULTS The mean follow-up was 5.2 months; symptoms were reduced by 42.8% (NF) (p = 0.001) and 57.1% (VGP) (p = 0.006). The Boix-Ochoa score was favorable for both groups: NF (p < 0.001) and VGP (p < 0.042). The overall mortality was 14.28% in both groups and was due to causes unrelated to the surgical treatment.
CONCLUSION The two operative procedures were shown to be efficient and efficacious for the treatment of GERD in neuropathic patients over the study period. HYPOTHESIS Gastroesophageal reflux (GER) is a common condition in childhood that frequently requires operative treatment. The 360-degree Nissen fundoplication (NF) has been the standard operation for GER, but is associated with substantial rates of recurrence, "gas bloat," gagging, and dysphagia. I believe that the Toupet fundoplication (TF), a 270-degree posterior wrap originally described in conjunction with myotomy for achalasia, has fewer complications, and its long-term outcome in children compared with NF is favorable. DESIGN Nonrandomized controlled trial. SETTING Tertiary care children's hospital. PATIENTS Two hundred fifty-six children (aged 3 months to 16 years) with GER disease unresponsive to nonoperative therapy who underwent either NF (n = 102) or TF (n = 154). INTERVENTION Operative repair of GER disease by either NF or TF. MAIN OUTCOME MEASURES Time to first feeding, time to discharge from the hospital, postoperative dysphagia complications, recurrence, and rehospitalization and reoperation rates for each fundoplication technique. RESULTS The 2 fundoplication techniques had equivalent recurrence rates, but TF had significantly lower rates of postoperative dysphagia (P = .008) and rehospitalization/reoperation (P = .005) and significantly shorter times to discharge from the hospital (P = .01) and to the first feeding (P = .02). CONCLUSIONS These data show that both NF and TF are effective procedures for GER in children, with acceptable recovery times and low recurrence rates. However, TF results in earlier feeding and discharge from the hospital and has a significantly lower incidence of dysphagia, gagging, and gas bloat, resulting in fewer rehospitalizations.
In this population, TF seems to be superior to NF. BACKGROUND Children with neurological impairment (NI) commonly have gastroesophageal reflux disease (GERD) treated with a fundoplication. The impact of this procedure on quality of life is poorly understood. OBJECTIVES To examine the quality of life of children with NI who have received a fundoplication for GERD and of their caregivers. METHODS The study was a prospective cohort study of children with NI and GERD who underwent a fundoplication at a children's hospital between January 1, 2005, and July 7, 2006. Quality of life of the children was assessed with the Child Health Questionnaire (CHQ) and of the caregivers with the Short-Form Health Survey (SF-36) and Parenting Stress Index (PSI), both at baseline and 1 month after fundoplication. Functional status was assessed using the WeeFIM. Repeated-measures analyses were performed. RESULTS Forty-four of the 63 parents (70%) were enrolled. The median WeeFIM score was 31.2 versus the age-normal score of 83 (P = .001). Compared with the baseline scores, mean CHQ scores improved over 1 month in the domains of bodily pain (32.8 vs. 47.5, P = .01), role limitations-physical (30.6 vs. 56.6, P = .01), mental health (62.7 vs. 70.6, P = .01), family limitation of activities (43.3 vs. 55.1, P = .03), and parental time (43.0 vs. 55.3, P = .03). The parental SF-36 domain of vitality improved from baseline over 1 month (41.3 vs. 48.2, P = .001), but there were no changes from baseline in Parenting Stress scores. CONCLUSIONS Parents reported that the quality of life of children with NI who receive a fundoplication for GERD was improved from baseline in several domains 1 month after surgery.
The quality of life and stress of caregivers did not improve in nearly all domains, at least in the short term. Background This prospective study investigated the therapy-induced changes in the quality of life (QoL) experienced by neurologically healthy and neurodevelopmentally delayed children and their parents after laparoscopic anterior 270° fundoplication (LAF). Methods In this study, 40 patients (21 impaired) with a mean age of 7.8 years underwent LAF for gastroesophageal reflux disease (GERD) and were evaluated before surgery and then 3 and 6 months afterward using the Gastrointestinal Quality-of-Life Index (GIQLI) supplemented by conventional symptom markers. Results Growth, proton pump inhibitor use, and frequency of supraesophageal/respiratory symptoms improved significantly (p < 0.001), as did feeding parameters (p < 0.05). The global GIQLI score improved by 49 ± 21% (p < 0.001). The greatest improvement occurred in the symptoms domain (p < 0.001). However, positive alterations also were found in the dimensions of emotions (58%), social functions (37%), and physical functions (27%) (p < 0.001). Comparison of the overall benefit did not show any differences between the subgroups of neurologically fit and impaired children. However, for the child-centered symptoms domain, the benefit increased stepwise with the degree of impairment. This was counterbalanced by an inverse relationship for the parent-centered emotions domain (p < 0.05). Conclusions Besides the known improvement in symptoms, LAF achieves a significant improvement in QoL for children and their parents. There is no overall difference in the benefit experienced by neurologically impaired and healthy children. Abstract Background: The performance of laparoscopic antireflux surgery is steadily increasing among pediatric surgeons. Different techniques are being used.
However, due to a lack of standardized follow-up methods, postoperative results are difficult to compare. In this study, we describe the results of a postoperative 24-h pH study as an objective criterion for evaluating the results of laparoscopic Thal antireflux surgery. Methods: In a prospective study, 53 patients underwent a laparoscopic Thal procedure. Preoperatively, all patients were subjected to 24-h pH monitoring, an upper GI series, and esophagogastroscopy. pH monitoring was performed 3 months postoperatively to evaluate the effect of the fundoplication. Esophagogastroscopy was repeated in case of preoperative esophagitis. Results: In one patient, the laparoscopy was converted to an open procedure. Feeding was commenced on day 1 in 49 of the 53 children. Mean hospitalization time was 4.4 days. One patient was reoperated for a too-tight fundoplication, and two patients died of unrelated causes. Ultimately, 44 of 50 children (88%) were free of symptoms; however, 11 of 41 children (25%) still displayed pathological reflux on pH monitoring. Conclusions: The Thal fundoplication can be performed laparoscopically in children. Children have a quick recovery, and hospitalization is short (4.4 days). At follow-up, nearly 90% of the children are free of symptoms. However, 25% still have pathological reflux as measured with pH monitoring. Therefore, questionnaires alone are not a sufficient means of measuring postoperative outcome. pH monitoring is a valuable additional tool for the objective postoperative evaluation of the results of (laparoscopic) antireflux procedures. The use of the laparoscopic approach to perform antireflux procedures has increased dramatically since its introduction in 1991. To date, no prospective randomized studies comparing open surgery to the minimally invasive approach in children have been reported.
Many retrospective reviews and case series have demonstrated that laparoscopic antireflux procedures are safe and effective once the learning curve is achieved. This position paper is coauthored by the New Technology Committee of the American Pediatric Surgery Association. The goal is to discuss the ongoing controversies and summarize the available evidence to identify the risks and benefits of laparoscopic antireflux procedures. Two gastroesophageal reflux disease (GERD) symptom questionnaires were developed and tested prospectively in a pilot study conducted in infants (1 through 11 months) and young children (1 through 4 years) with and without a clinical diagnosis of GERD. A pediatric gastroenterologist made the clinical diagnosis of GERD. Parents or guardians at 4 study sites completed the questionnaires, providing information on the frequency and severity of symptoms appropriate to the 2 age cohorts. In infants, symptoms assessed were back arching, choking or gagging, hiccups, irritability, refusal to feed, and vomiting or regurgitation. In young children, symptoms assessed were abdominal pain, burping or belching, choking when eating, difficulty swallowing, refusal to eat, and vomiting or regurgitation. Respondents were asked to describe additional symptoms. Symptom frequency was the number of occurrences of each symptom in the 7 days before completion of the questionnaire. Symptom severity was rated from 1 (not at all severe) to 7 (most severe). An individual symptom score was calculated as the product of symptom frequency and severity scores. The composite symptom score was the sum of the individual symptom scores. The mean composite symptom and individual symptom scores were higher in infants (P < 0.001 and P < 0.05, respectively) and young children (P < 0.001 and P < 0.05, respectively) with GERD than controls. Vomiting/regurgitation was particularly prevalent in infants with GERD (90%).
Both groups with GERD were more likely to experience greater severity of symptoms. We found the GERD Symptom Questionnaire useful in distinguishing infants and young children with symptomatic GERD from healthy children. BACKGROUND/PURPOSE Surgery is indicated for the treatment of gastroesophageal reflux disease (GERD) in children when medical treatment fails or complications are encountered. However, it has not been fully established how the results after surgery can be evaluated. A prospective study was performed to evaluate the results of surgical therapy for GERD by pH monitoring (PM) and esophageal manometry (EM) in children. METHODS Patients who were candidates for anti-reflux surgery between 2003 and 2004 were evaluated for symptoms, growth, and results of PM and EM in both the pre- and postoperative periods. RESULTS Thirteen patients were included (mean age = 6.65 ± 3.28 years, male/female ratio = 10/3). The most frequently occurring symptoms were recurrent respiratory infections (RRI) (n = 11) and vomiting (n = 8). Nissen fundoplication was performed because of unresponsiveness to treatment (n = 10), RRI (n = 9), failure to thrive (n = 7), and esophagitis (n = 2) after medical treatment (2-36 months). Symptoms were resolved in 83.9% of patients and were unchanged in 16.1% following surgery. Weight percentiles had significantly improved (pre: 12.38, post: 25.4, p < 0.05) during a short follow-up period (1-4 months). Mean reflux index (pre: 24.73 ± 21.07%, post: 0.93%, range 0-3.6, p < 0.05), reflux time (pre: 368 ± 313 min, post: 17.1 ± 15.9 min, p < 0.05), number of episodes (pre: 344.7 ± 343.6, post: 19.53 ± 11.13, p < 0.05), and number of reflux episodes longer than 5 minutes (pre: 4.3, range 0-58, post: 0.61, range 0-3, p < 0.05) were found to be reduced after surgery by PM.
Lower esophageal sphincter pressure (pre: 55 ± 27 cmH2O, post: 64.46 ± 30.85 cmH2O), contraction amplitude (pre: 141.92 ± 69.11 cmH2O, post: 130.69 ± 45 cmH2O), and contraction velocity (pre: 1.94 cm/s, range 0.1-7.5, post: 4.29 cm/s, range 0.2-10) did not differ postoperatively (p > 0.05). However, contraction times were decreased postoperatively (pre: 73.6 ± 52.9 s, post: 27.67 ± 20.1 s, p < 0.05) and were found to be correlated with reflux time and the number of reflux episodes longer than 5 minutes. CONCLUSION Nissen fundoplication is effective for the treatment of GERD. It supports the anti-reflux mechanism without affecting esophageal motility except for contraction times. The decrease in contraction time after surgery can be explained by the decreases in reflux time and in the number of reflux episodes longer than 5 minutes. PM and EM confirmed the clinical improvement and can be used for the evaluation of the results of NF. Fundoplication remains a common operation in the brain-damaged pediatric patient, but recent reports suggest a poor outcome in these patients. The factors that might be associated with complications or recurrence after fundoplication have not been extensively examined. Fifty-six brain-damaged children, aged 6 months to 12 years, with documented gastroesophageal (GE) reflux underwent preoperative nutritional evaluations (percentage of ideal weight, albumin, nutrition risk index [NRI]) and documentation of medications (dexamethasone for bronchopulmonary dysplasia) before standard Nissen fundoplication. Hospital stay, intensive care unit (ICU) stay, and time on ventilator, as well as major postoperative complications (wound infection/dehiscence, pneumonia), were prospectively analyzed. Survival and recurrence rates 1 to 3 years postoperatively were also assessed.
Eighty-two percent of patients were < 90% of ideal weight, 50% had NRI < 90 (normal = 100), and 29% had albumin < 3.5 g/dL. Albumin < 3.5 was significantly (P < .01) associated with prolonged hospitalization (26.8 ± 2.2 versus 15.1 ± 1.1 days), ICU stay (13.8 ± 1.0 versus 4.4 ± 0.5 days), and time on ventilator (8.0 ± 1.0 versus 1.8 ± 0.4 days). NRI < 90 showed similar significant differences (P < .01). Ideal body weight < 90% was not significant. Major complications developed in 54% of patients; only two or more preoperative nutritional deficiencies, or a nutritional deficiency plus dexamethasone, were significantly associated (P < .01). Recurrence occurred in 21% of patients and was significantly correlated with preoperative dexamethasone alone (P < .01), and especially with dexamethasone plus a nutritional deficit (low albumin, P < .001; low NRI, P < .005). (ABSTRACT TRUNCATED AT 250 WORDS) BACKGROUND The aim of this study was to compare short-term outcomes, including intra- and perioperative complications, following laparoscopic Nissen versus Thal fundoplication. PATIENTS AND METHODS From July 1998 until April 2007, 175 patients were recruited. Patients were prospectively randomized to either a Nissen wrap or a Thal wrap. The observation period was 6 weeks after surgery. RESULTS 89 Nissen and 86 Thal procedures were performed. The mean age at the time of operation (OP) was 5.2 years. Demographics were similar, although weight at OP was significantly less in the Nissen group. Intraoperative complications during a Nissen included bleeding from a liver laceration in 2 patients (1 required conversion) and small bowel perforation during open port insertion in 1 patient. There were two conversions in the Thal group, due to bleeding from the omentum in 1 patient and equipment failure in the other.
In a third patient, the colon was perforated during insertion of a percutaneous endoscopic gastrostomy (PEG) and repaired laparoscopically. Post-OP dysphagia was similarly distributed among both groups, but was significantly more severe after a Nissen (P = 0.018). There were two early deaths: in the Nissen group, 1 child died from peritonitis after the gastrostomy tube fell out, whereas one death in the Thal group was caused by respiratory failure associated with the patient's underlying condition. CONCLUSIONS There was no statistical difference in the short-term outcomes between laparoscopic Nissen and Thal fundoplication, apart from a higher rate of esophagoscopy for severe dysphagia in the Nissen group. The higher number of postoperative complications in the Nissen group was largely due to gastrostomy-related problems.
11,067
22,346,345
The most common intervention strategy that pharmacists utilized was a combination of patient education and drug monitoring. This review suggests that pharmacist intervention is effective in improving patient adherence to antidepressants.
BACKGROUND Pharmacist intervention in improving patient adherence to antidepressants is coupled with better outcomes. AIMS The aim of this investigation was to systematically examine the published literature to explore the different types of pharmacist interventions used to enhance patient adherence to antidepressant medications. Three specific questions guided the review: What is the impact of pharmacist interventions on adherence to antidepressant medication? What is the impact of pharmacist interventions on patient-reported outcomes and patient satisfaction? What types of interventions do pharmacists use to enhance patients' adherence to antidepressants?
OBJECTIVE To determine (1) whether telephone follow-up using a standardized telemonitoring tool can influence the nature and extent to which antidepressant users provide feedback to pharmacists, (2) whether patient characteristics are associated with the extent of patient feedback, and (3) how patient feedback affects subsequent outcomes after controlling for patient characteristics. DESIGN Randomized, controlled, experimental design. SETTING Eight Wisconsin community pharmacies within a large managed care organization. PATIENTS 60 patients presenting new antidepressant prescriptions. INTERVENTIONS Three monthly telephone calls from pharmacists providing structured education and monitoring. MAIN OUTCOME MEASURES Frequency of patient feedback to pharmacists, antidepressant knowledge, beliefs, percentage of missed doses, depression symptom scores, and perceptions of progress. RESULTS Compared with usual-care patients (n = 32), pharmacist-guided education and monitoring (PGEM) patients (n = 28) provided significantly more feedback to pharmacists regarding different aspects of their antidepressant therapy, even after controlling for patient characteristics. Regression results also showed that patient feedback was significantly associated with greater antidepressant knowledge, positive antidepressant beliefs, and perceptions of progress after 3 months. Patient feedback was unrelated to nonadherence and depressive symptoms. CONCLUSION Structured education and monitoring by pharmacists significantly improves the level of patient feedback to pharmacists, and such feedback may help pharmacists identify and address their patients' misconceptions, concerns, and progress with antidepressant therapy. Among a sample of 119 distressed high utilizers of primary care, 45% of patients evaluated by a psychiatrist as needing antidepressant treatment had been treated in the year before the examination.
However, only 11% of the patients needing antidepressants had received adequate dosage and duration of pharmacotherapy. In the year following the intervention, study patients whose physicians were advised regarding treatment during a psychiatric consultation were more likely to receive antidepressant medications (52.7%) relative to a randomized control group (36.1%). However, the intervention did not significantly increase the provision of adequate antidepressant therapy (37.1% vs 27.9%). Among study patients using antidepressants, patient characteristics did not differentiate patients who received adequate dosage and duration of antidepressant medications from those who did not. Analysis of data on the duration of antidepressant therapy for all health maintenance organization enrollees initiating use of antidepressants showed that only 20% of patients who had been given prescriptions for first-generation antidepressants (amitriptyline, imipramine, or doxepin) filled four or more prescriptions in the following six months, compared to 34% of patients who had prescriptions for newer antidepressants (nortriptyline, desipramine, trazodone, and fluoxetine). Experimental research evaluating whether these newer medications (with more favorable side effect profiles) improve adherence, and thereby patient outcome, is needed. OBJECTIVE The most common ways of assessing adherence to oral antipsychotic medications in research and in clinical practice are self-report and physician report. This prospective study examined the agreement among measures of adherence to oral antipsychotic medications among 52 outpatients with schizophrenia. METHODS Participants were assessed at baseline during a visit to their outpatient clinic and followed for 12 weeks.
Adherence was assessed by using subjective measures (self-report and physician report) and objective measures (pill counts conducted in the home, electronic monitoring, and blood plasma concentrations). Electronic monitoring was used as an imperfect standard against which the other methods were judged. RESULTS Data from pill counts and from electronic monitoring were strongly correlated (r(k) = .61). Self-report and physicians' ratings of compliance were weakly correlated with pill count and electronic monitoring when compliance scores were examined with rank-order correlations (r(k) = .18-.32). When the sample was dichotomized into adherent and nonadherent groups on the basis of electronic monitoring or pill count (at least 80% adherent), neither physicians nor patients identified adherent behavior (kappa ≤ .20). Blood plasma concentrations were not correlated with any other measures of adherence (kappa ≤ .20). Self-report and physician report were best correlated with clinical state (r(k) = -.27, r(k) = -.25), suggesting that patients and treating professionals may use clinical state to estimate adherence. CONCLUSIONS Patients and physicians were not able to identify adherence. The inability of physicians to accurately identify adherent individuals is likely to have important consequences for prescribing behavior, health care costs, and patient outcomes. The object of the study was to evaluate outcomes of a randomized clinical trial (RCT) of a pharmacist intervention for depressed patients in primary care (PC). We report antidepressant (AD) use and depression severity outcomes at 6 months. The RCT was conducted between 1998 and 2000 in 9 eastern Massachusetts PC practices. We studied 533 patients with major depression and/or dysthymia as determined by a screening test done at the time of a routine PC office visit.
The majority of participants had recurrent depressive episodes (63.5% with ≥4 lifetime episodes), and 49.5% were taking AD medications at enrollment. Consultation in person and by telephone was performed by a clinical pharmacist who assisted the primary care practitioner (PCP) and patient in medication choice, dose, and regimen, in accordance with AHCPR depression guidelines. Six-month AD use rates for intervention patients exceeded controls (57.5% vs. 46.2%, P = .03). Furthermore, the intervention was effective in improving AD use rates for patients not on ADs at enrollment (32.3% vs. 10.9%, P = .001). The pharmacist intervention proved equally effective in subgroups traditionally considered difficult to treat: those with chronic depression and dysthymia. Patients taking ADs had better modified Beck Depression Inventory (mBDI) outcomes than patients not taking ADs (-6.3 points change vs. -2.8, P = .01), but the outcome differences between intervention and control patients were not statistically significant (17.7 BDI points vs. 19.4 BDI points, P = .16). Pharmacists significantly improved rates of AD use in PC patients, especially for those not on ADs at enrollment, but outcome differences were too small to be statistically significant. Difficult-to-treat subgroups may benefit from pharmacists' care. The objective of this systematic review was to evaluate the impact of pharmacist-delivered community-based services to optimise the use of medications for mental illness. Twenty-two controlled (randomised and non-randomised) studies of pharmacists' interventions in community and residential aged care settings identified in the international scientific literature were included for review. Papers were assessed for study design, service recipient, country of origin, intervention type, number of participating pharmacists, methodological quality, and outcome measurement.
Three studies showed that pharmacists' medication counselling and treatment monitoring can improve adherence to antidepressant medications among those commencing treatment when calculated using an intention-to-treat analysis. Four trials demonstrated that pharmacist-conducted medication reviews may reduce the number of potentially inappropriate medications prescribed to those at high risk of medication misadventure. The results of this review provide some evidence that pharmacists can contribute to optimising the use of medications for mental illness in the community setting. However, more well-designed studies are needed to assess the impact of pharmacists as members of community mental health teams and as providers of comprehensive medicines information to people with schizophrenia and bipolar disorder. BACKGROUND The prevalence, long-term temporal consistency, and factors influencing negative attitudes and poor treatment adherence among psychiatric patients with major depressive disorder (MDD) are not well known. METHODS In the Vantaa Depression Study (VDS), a prospective 5-year study of psychiatric patients with DSM-IV MDD, 238 (88.5%) patients' attitudes towards and adherence to both antidepressants and psychotherapeutic treatments at baseline, 6 months, 18 months, and 5 years were investigated. RESULTS Throughout the follow-up, most patients reported positive attitudes towards pharmacotherapy and psychosocial treatments, and good adherence. While attitudes became more critical over time, adherence to psychosocial treatment improved, whereas adherence to pharmacotherapy remained unchanged. Employment predicted positive attitude (OR = 1.97, 95% CI 1.01-3.83, P = 0.046), and a larger social network good adherence (OR = 1.11, 95% CI 1.00-1.23, P = 0.042), to pharmacotherapy at the last follow-up.
Cluster B personality disorder symptoms predicted negative attitude (OR = 0.82, 95% CI 0.70-0.96, P = 0.012) and poor adherence (OR = 0.83, 95% CI 0.72-0.95, P = 0.007), whereas cluster C symptoms predicted positive attitude (OR = 1.30, 95% CI 1.09-1.54, P = 0.003), and living alone good adherence (OR = 3.13, 95% CI 1.10-9.09, P = 0.032), to psychosocial treatment. LIMITATIONS Patients may exaggerate their adherence to treatments. Attrition from follow-up may occur due to undetected negative change in treatment attitude or adherence. CONCLUSIONS Among psychiatric MDD patients in long-term follow-up, treatment attitudes and adherence to pharmaco- and psychotherapy were and remained mostly positive. They were significantly predicted by personality features and social support. Attention to the adherence of those with cluster B personality disorders, or poor social support, may be needed. Introduction The efficacy of antidepressants in the treatment of depression has been convincingly demonstrated in randomised trials. However, non-adherence to antidepressant treatment is common. Objective To evaluate, from a societal perspective, the cost effectiveness of a pharmacy-based intervention to improve adherence to antidepressant therapy in adult patients receiving treatment in primary care. Methods An economic evaluation was performed alongside a 6-month randomised controlled trial in the Netherlands. Patients who came to 19 pharmacies with a new prescription for a non-tricyclic antidepressant, i.e. those who had not received any prescription for an antidepressant in the past 6 months, were invited to participate. They were then randomly allocated to education and coaching by the pharmacist or to usual care. The coaching programme consisted of three contacts with the pharmacist, with a mean duration of between 13 and 20 minutes, and a take-home video reviewing important facts on depression and antidepressant treatment.
The clinical outcome measures were adherence to antidepressant treatment, measured using an electronic pill container (eDEM), and improvement in depressive symptoms, measured using the Hopkins Symptom Checklist (SCL). Resource use was measured by means of questionnaires. The uncertainty around differences in costs and cost effectiveness between the treatment groups was evaluated using bootstrapping. Results Seventy patients were randomised to the intervention group and 81 to the usual care group; of these, 40 in the intervention group and 48 in the control group completed all of the follow-up questionnaires. There were no significant differences in adherence, improvements in the SCL depression mean item score, and costs over 6 months between the two treatment groups. Mean total costs (2002 values) were €3275 in the intervention group and €2961 in the control group (mean difference €315; 95% CI -1922 to 2416). The incremental cost-effectiveness ratio associated with the pharmacist intervention was €149 per 1% improvement in adherence and €2550 per point improvement in the SCL depression mean item score. Cost-effectiveness planes and acceptability curves indicated that the pharmacist intervention was not likely to be cost effective compared with usual care. Conclusion In patients starting treatment with antidepressants, there were no significant differences in adherence, severity of depression, costs, or cost effectiveness between patients receiving coaching by a pharmacist and patients receiving usual care after 6 months. Considering the resources needed to implement an intervention like this in clinical practice, based on these results, the continuation of usual care is recommended. OBJECTIVE Documentation and evaluation of patient outcomes in a pilot study into the role of rural community pharmacists in the management of depression. DESIGN Parallel-groups design with a control and an intervention group.
SETTING Thirty-two community pharmacies in rural and remote New South Wales, Australia. PARTICIPANTS One hundred and six patient participants, mean age of 46 years, predominantly female, not currently employed, recruited by participating pharmacists. INTERVENTIONS Intervention pharmacists were given video-conference training on the nature and management of depression by a psychiatrist, psychologist, and general practitioner and asked to dispense medication with extra advice and support. Control pharmacists were asked to provide usual care. MAIN OUTCOME MEASURES Adherence by self-report, K10, Drug Attitude Index. RESULTS The results indicated that adherence to medications was high in both groups (95% versus 96%) and that both groups had improved significantly in wellbeing (a reduction in K10 score of 4 (control) versus 4.7 (intervention)). No significant change was found in attitude to drug treatment once baseline scores were controlled for. CONCLUSIONS Because both groups improved in wellbeing, it is not possible to claim that the training provided to the intervention pharmacists was responsible for the success. However, the improvements gained in such a short time (two months) suggest that the involvement of pharmacists has had a beneficial rather than negative effect. Further research into the most appropriate ways in which to integrate the skills of pharmacists into a model of mental health care delivery in rural communities is recommended. To measure the effects of a collaborative care model that emphasized the role of clinical pharmacists in providing drug therapy management and treatment follow-up to patients with depression, we conducted a randomized controlled trial at a staff-model health maintenance organization. We compared the outcomes of subjects treated in this collaborative care model (75 patients, intervention group) with subjects receiving usual care (50 patients, control group).
After 6 months, the intervention group demonstrated a significantly higher drug adherence rate than that of the control group (67% vs 48%, odds ratio 2.17, 95% confidence interval 1.04-4.51, p = 0.038). Patient satisfaction was significantly greater among members randomly assigned to pharmacists' services than among controls, and provider satisfaction surveys revealed high approval rates as well. Changes in resource utilization were favorable for the intervention group, but differences from the control group did not achieve statistical significance. Clinical improvement was noted in both groups, but the difference was not significant. Clinical pharmacists had a favorable effect on multiple aspects of patient care. Future studies of this model in other health care settings appear warranted. PURPOSE The impact of pharmacist interventions on the care and outcomes of patients with depression in a primary care setting was evaluated. METHODS Patients diagnosed with a new episode of depression and started on antidepressant medications were randomized to enhanced care (EC) or usual care (UC) for one year. EC consisted of a pharmacist collaborating with primary care providers to facilitate patient education, the initiation and adjustment of antidepressant dosages, the monitoring of patient adherence to the regimen, the management of adverse reactions, and the prevention of relapse. The patients in the UC group served as controls. Outcomes were measured by the Hopkins Symptom Checklist; Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, criteria for major depression; health-related quality of life; medication adherence; patient satisfaction; and use of depression-related health care services. An intent-to-treat analysis was used. RESULTS Seventy-four patients were randomized to EC or UC. At baseline, the EC group included more patients diagnosed with major depression than did the UC group (p = 0.04).
All analyses were adjusted for this difference. In both groups, mean scores significantly improved from baseline for symptoms of depression and quality of life at three months and were maintained for one year. There were no statistically significant differences between treatment groups in depression symptoms, quality of life, medication adherence, provider visits, or patient satisfaction. CONCLUSION Frequent telephone contacts and interventions by pharmacists and UC in a primary care setting resulted in similar rates of adherence to antidepressant regimens and improvements in the outcomes of depression at one year. Pain is common in cancer patients. To ensure optimal pain management, the efficacy and effectiveness of new drugs and treatments have to be investigated in clinical trials. Efficacy trials such as randomised controlled trials (RCTs) are experimental studies and estimate the maximum potential benefit to be derived from an intervention in ideal circumstances and under a controlled environment. RCTs are the only trial design that can establish causal effects. A crossover study is a special type of RCT in which patients serve as their own controls. In efficacy studies, the intervention and control groups should be as homogeneous as possible; confounding variables are controlled, bias is reduced, and internal validity is high, whereas external validity is low. Studies looking at effectiveness assess clinical practice and reflect real-life circumstances. They rely heavily on external validity at the expense of careful controls; the study population is heterogeneous, and confounding variables are examined. Cohort studies follow a group or groups of individuals with a common characteristic over a period of time to measure outcomes. Case-control studies start with the outcome and compare the characteristics of two groups of interest, those with the outcome and those without, to identify factors which occur more or less often in the poor-outcome group.
Definition of outcome criteria is crucial both for efficacy and effectiveness studies and is often a primary problem. All clinical studies must use valid and reliable outcome measures. We performed a randomized trial to prevent depression relapse in primary care by evaluating intervention effects on medication attitudes and self-management of depression. Three hundred and eighty-six primary care patients at high risk for recurrent depression were randomized to receive a 12-month intervention. Interviews at baseline, 3, 6, 9, and 12 months assessed attitudes about medication, confidence in managing side effects, and depression self-management. This depression relapse prevention program significantly increased: 1) favorable attitudes toward antidepressant medication [Beta = .26, 95% C.I. = (.18, .33)]; 2) self-confidence in managing medication side effects [Beta = .53, 95% C.I. = (.15, .91)]; 3) depressive symptom monitoring [O.R. = 4.08, 95% C.I. = (2.80, 5.94)]; 4) checking for early warning signs [O.R. = 3.27, 95% C.I. = (2.32, 4.61)]; and 5) planful coping [O.R. = 2.01, 95% C.I. = (1.49, 2.72)]. Significant predictors of adherence to long-term pharmacotherapy were: favorable attitudes toward antidepressant treatment [OR = 2.20, 95% CI = (1.50, 3.22)] and increased confidence in managing medication side effects [OR = 1.10, 95% CI = (1.04, 1.68)]. Among primary care patients at high risk for depression relapse, enhanced attitudes towards antidepressant medicines and higher confidence in managing side effects were key factors associated with greater adherence to maintenance pharmacotherapy. OBJECTIVE To examine the effects of pharmacist monitoring on patient satisfaction with and adherence to antidepressant medication therapy. DESIGN In this prospective field study, we interviewed patients starting an antidepressant after a new prescription was dispensed and again 2 months later.
The first interview assessed patients' characteristics, antidepressant medication history, knowledge of antidepressant medications and their use, and beliefs about antidepressant medications. The second interview focused on pharmacist monitoring behavior and satisfaction with the antidepressant medication. SETTING AND PARTICIPANTS From 23 community pharmacies, we enrolled 100 patients, 59 of whom were taking an antidepressant for the first time. MAIN OUTCOME MEASURES Patient satisfaction with and reported adherence to their antidepressant medication regimen. RESULTS Pharmacist monitoring of patients' antidepressant medication use varied. More than 70% of patients reported that pharmacists asked about medication concerns; 53% and 54% of patients, respectively, said pharmacists encouraged their questions and listened to their concerns; and 32% found pharmacists helpful in solving problems related to the antidepressant. Fifty-seven percent of patients reported feeling better a lot of the time since taking the antidepressant, 40% said the antidepressant did not bother them, and 83% reported missing doses, adding doses, or stopping the antidepressant during the study period. Initial beliefs about antidepressants were a strong predictor of patient outcomes. Pharmacist monitoring was predictive of satisfaction and adherence for individuals taking an antidepressant for the first time. CONCLUSION Pharmacists can play a critical role in monitoring medication concerns at the beginning of use, allowing for problem solving, reinforcement, and greater patient satisfaction with and adherence to medication therapy. Obstacles to effective pharmacist monitoring and follow-up need to be identified and addressed in future improvement efforts. OBJECTIVE Many depressed patients have negative beliefs about antidepressants, leading to poor adherence, unfavorable depression outcome, and low perceived well-being, role functioning, and quality of life.
Interventions to ameliorate beliefs are therefore needed. METHOD In a cluster-randomized controlled trial conducted from September 1999 to January 2001, 2 interventions to improve management of major depressive disorder in primary care were compared: (1) a depression care program (DCP), providing enhanced patient education, stimulation of active participation of general practitioners and patients in the treatment process, discussion of benefits and costs of taking antidepressant medication, and systematic follow-up, and (2) a systematic follow-up program (SFP). Thirty general practitioners were randomly assigned, and 211 patients with current major depressive disorder (diagnosed according to DSM-IV) were included. All patients were prescribed a selective serotonin reuptake inhibitor. Beliefs were assessed at baseline, at week 10, and at week 26. Differences in change of beliefs between DCP and SFP groups were analyzed. RESULTS Changes in patients' beliefs were more favorable in the DCP condition at week 10 and week 26, compared with SFP only (beliefs concerning appropriate medication-taking, week 10: effect size = 0.39, p = .012; week 26: effect size = 0.55, p = .001; beliefs concerning harmfulness, week 10: effect size = 0.45, p = .011; week 26: effect size = 0.62, p = .002). CONCLUSION The depression care program ameliorates beliefs about antidepressants in primary care patients with major depressive disorder. The study results encourage the implementation of a depression care program in order to improve beliefs about antidepressant medication in primary care patients diagnosed with major depressive disorder. The effects on adherence and depressive symptoms of a community pharmacy-based coaching program, including a take-home videotape, were evaluated in a randomized controlled trial in the Netherlands.
A total of 147 depressed primary care patients who had a new antidepressant prescription were included in the study. Adherence was measured with an electronic pill container and was also derived from pharmacy medication records; the latter method was associated with an overestimation of adherence of only 5 percent. Intention-to-treat analyses showed no intervention effect on adherence (73 percent compared with 76 percent), whereas analyses of patients who received the intervention (per protocol) showed improved adherence (73 percent compared with 90 percent). Neither analysis showed effects on depressive symptoms. OBJECTIVE To explore the impact of telephone-based education and monitoring by community pharmacists on multiple outcomes of pharmacist-patient collaboration. DESIGN A randomized, controlled, unblinded, mixed experimental design. SETTING Eight Wisconsin community pharmacies within a large managed care organization. PATIENTS A total of 63 patients presenting new antidepressant prescriptions to their community pharmacies. INTERVENTIONS Patients were randomized to receive either three monthly telephone calls from pharmacists providing pharmacist-guided education and monitoring (PGEM) or usual pharmacist's care. Usual care is defined as that education and monitoring which pharmacists may typically provide patients at the study pharmacies. MAIN OUTCOME MEASURES Patient's frequency of feedback with the pharmacist, antidepressant knowledge, antidepressant beliefs, antidepressant adherence at 3 and 6 months, improvement in depression symptoms, and orientation toward treatment progress. RESULTS Of the 60 patients who completed the study, 28 received PGEM and 32 received usual pharmacist's care. Results showed that PGEM had a significant and positive effect on patient feedback, knowledge, medication beliefs, and perceptions of progress.
There were no significant group differences in patient adherence or symptoms at 3 months; however, PGEM patients who completed the protocol missed fewer doses than did the usual care group at 6 months (p ≤ .05). CONCLUSION Antidepressant telemonitoring by community pharmacists can significantly and positively affect patient feedback and collaboration with pharmacists. Longer-term studies with larger samples are needed to assess the generalizability of findings. Future research also needs to explore additional ways to improve clinical outcomes. Background Treatment of depression, the most prevalent and costly mental disorder, needs to be improved. Non-concordance with clinical guidelines and non-adherence can limit the efficacy of pharmacological treatment of depression. Through pharmaceutical care, pharmacists can improve patients' compliance and wellbeing. The aim of this study is to evaluate the effectiveness and cost-effectiveness of a community pharmacist intervention developed to improve adherence and outcomes of primary care patients with depression. Methods/design A randomized controlled trial, with 6-month follow-up, comparing patients receiving a pharmaceutical care support programme in primary care with patients receiving usual care. The total sample comprises 194 patients (aged between 18 and 75) diagnosed with depressive disorder in a primary care health centre in the province of Barcelona (Spain). Subjects will be asked for written informed consent in order to participate in the study. Diagnosis will be confirmed using the SCID-I. The intervention consists of an educational programme focused on improving knowledge about medication, making patients aware of the importance of compliance, reducing stigma, reassuring patients about side-effects and stressing the importance of carrying out general practitioners' advice. Measurements will take place at baseline and after 3 and 6 months.
Main outcome measure is compliance with antidepressants. Secondary outcomes include: clinical severity of depression (PHQ-9), anxiety (STAI-S), health-related quality of life (EuroQol-5D), satisfaction with the treatment received, side-effects, chronic physical conditions and socio-demographics. The use of healthcare and social care services will be assessed with an adapted version of the Client Service Receipt Inventory (CSRI). Discussion This trial will provide valuable information for health professionals and policy makers on the effectiveness and cost-effectiveness of a pharmaceutical intervention programme in the context of primary care. Trial registration
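Several of the trials above summarise a group difference as an odds ratio with a 95% confidence interval (e.g. adherence of 67% vs 48%, OR 2.17, 95% CI 1.04-4.51). A minimal sketch of how such an estimate is derived from a 2x2 table, using the standard Wald (log-OR) interval; the function name and the illustrative counts are mine, not taken from any of the trials:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = intervention with outcome, b = intervention without,
    c = control with outcome,      d = control without."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) is sqrt of the sum of reciprocal cell counts.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts giving roughly 67% vs 48% adherence (40/60 vs 29/60).
print(odds_ratio_ci(40, 20, 29, 31))
```

With these invented counts the point estimate comes out near the OR of 2.17 reported above, but the published figure was of course computed from the trial's actual data.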
11,068
25,635,194
Conclusions Although SCT is an interesting test to evaluate clinical decision-making in emergency medicine, our results raise concerns regarding whether the judgments of an expert panel are sufficiently valid as the reference standard for this test.
Background We aimed to compare the clinical judgments of a reference panel of emergency medicine academic physicians against evidence-based likelihood ratios (LRs) regarding the diagnostic value of selected clinical and paraclinical findings in the context of a script concordance test (SCT).
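The study above benchmarks panel judgments against evidence-based likelihood ratios. As background, a likelihood ratio updates a pre-test probability of disease through the odds form of Bayes' theorem; a minimal sketch (function names and the illustrative sensitivity/specificity values are mine, not from the study):

```python
def positive_lr(sensitivity, specificity):
    """LR+ = sensitivity / (1 - specificity): how much a positive
    finding raises the odds of disease."""
    return sensitivity / (1.0 - specificity)

def post_test_probability(pre_test_prob, lr):
    """Apply a likelihood ratio via odds: post-odds = pre-odds * LR."""
    pre_odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = pre_odds * lr
    return post_odds / (1.0 + post_odds)

# Hypothetical finding: sensitivity 0.90, specificity 0.80 -> LR+ = 4.5.
lr = positive_lr(0.90, 0.80)
# A 20% pre-test probability rises to about 53% after a positive finding.
print(post_test_probability(0.20, lr))
```

This is the arithmetic against which an "evidence-based LR" reference standard is defined; how closely expert panel judgments track it is exactly what the SCT comparison above interrogates.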
OBJECTIVE We sought to develop a valid, reliable assessment of intraoperative judgment by residents during gynecologic surgery based on Script Concordance Theory. STUDY DESIGN This was a multicenter prospective study involving 5 obstetrics and gynecology residency programs. Surgeons from each site generated case scenarios based on common gynecologic procedures. Construct validity was evaluated by correlating scores to training level, in-service examinations, and surgical skill and experience using a Global Rating Scale of Operative Performance and case volumes. RESULTS A final test that included 42 case scenarios was administered to 75 residents. Internal consistency (Cronbach alpha = 0.73) and test-retest reliability (Lin correlation coefficient = 0.76) were good. There were significant differences between test scores and training levels (p = .002) and test scores correlated with in-service examination scores (r = 0.38; p = .001). There was no association between test scores and total number of cases or technical skills. CONCLUSION The Script Concordance Test appears to be a reliable, valid assessment tool for intraoperative decision-making during gynecologic surgery. Aim: To examine which response options children prefer and which they find easiest to use, and to study the relative reliability of the different response options. Methods: A consecutive group of unselected children (n = 120) filled out three questionnaires in a paediatric outpatient clinic. Each questionnaire included seven similar questions, but had different response options: the Likert scale, the Visual Analogue Scale (VAS) and the numeric VAS. In general, the questions were not related to the children's particular diseases, but dealt with the frequency of simple activities, their feelings and opinions. The pages with the three different response options were offered in random order.
Afterwards, the children rated their preference and ease of use of the different response options on a scale from one to 10. Results: Children preferred the Likert scale (median mark 9.0) over the numeric VAS (median mark 8.0) and the simple VAS (median 6.0). They considered the Likert scale easiest to fill out (median mark 10 vs 9 and 7.5 for the numeric and simple VAS, respectively). Results of the different response options correlated strongly with each other (rho = 0.67-0.90, p < 0.05). Many controlled trials rely on subjective measures of symptoms or quality of life as primary outcomes. The relative merits of different response options for these measures is an important, but largely unexplored, issue. Therefore, we compared the responsiveness of seven-point Likert vs visual analogue scales (VAS) in a questionnaire measuring quality of life in chronic lung disease. The VAS and seven-point scale versions of the questionnaire were administered to 28 patients before and after completing an inpatient respiratory rehabilitation program of known benefit. For all four dimensions of the questionnaire (dyspnea, fatigue, emotional function, and mastery) the VAS showed a larger improvement than the seven-point scale when both were standardized on a scale of 0-10. However, in each case the variability of the improvement was greater using the VAS. The difference in improvement between the two scales was not statistically significant. We conclude that the two methods of presenting response options show comparable responsiveness. The ease of administration and interpretation of the seven-point scale recommend its use in clinical trials. OBJECTIVE To report on the creation and administration of an online Script Concordance Test (SCT) for ear, nose, and throat (ENT), the ENT-SCT. DESIGN Prospective study. SETTING Two tertiary care university centers.
PARTICIPANTS In total, 132 individuals were asked to test an ENT-SCT of 20 cases and 94 questions based on the major educational objectives of the ENT residency program. MAIN OUTCOME MEASURES Three levels of experience were tested: medical students, ENT residents, and board-certified otorhinolaryngologists as the expert panel. The test's construct validity (whether scores were related to clinical experience) was statistically analyzed. Reliability was estimated by the Cronbach α internal consistency coefficient. Participants' perception of the test was assessed with the use of a questionnaire. RESULTS The 65 respondents with usable data were medical students (n = 21), ENT residents (n = 22), and experts (n = 22). Total mean (SD) test scores differed significantly: 76.81 (3.31) for the expert panel, 69.05 (4.35) for residents, and 58.29 (5.86) for students. The Cronbach α coefficient was 0.95. More than two-thirds of the participants found the test to be realistic and relevant for assessing clinical reasoning. The test was also considered fun, interesting, and intuitive. CONCLUSIONS The Web-based ENT-SCT is feasible, reliable, and useful for assessing clinical reasoning. This online assessment tool may have applications for residency programs and continuing medical education.
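Both script concordance studies above report internal consistency as Cronbach's α (0.73 and 0.95). A minimal sketch of the standard formula, α = k/(k−1) · (1 − Σσ²_item / σ²_total), computed with sample variances; the function name and the toy data are mine, not from either study:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a test.
    items: one inner list per test item, each holding the scores of the
    same respondents on that item (all inner lists equal length)."""
    k = len(items)
    n = len(items[0])

    def sample_var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Per-respondent total score across all items.
    totals = [sum(item[i] for item in items) for i in range(n)]
    item_var_sum = sum(sample_var(item) for item in items)
    return (k / (k - 1)) * (1 - item_var_sum / sample_var(totals))

# Three perfectly correlated items -> alpha = 1.0.
print(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]))
```

The perfectly correlated toy items give α = 1.0; real item sets like the 42-scenario and 94-question tests above land below that, at the 0.73 and 0.95 values reported.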
11,069
26,557,880
The addition of MA to CT for patients with metastatic colorectal cancer does not prolong GS and PFS.
BACKGROUND The effectiveness of chemotherapy (CT) for select cases of metastatic colorectal cancer (MCRC) has been well established in the literature; however, it provides limited benefits and in many cases constitutes a treatment with high toxicity. The use of specific molecular biological treatments with monoclonal antibodies (MA) has been shown to be relevant, particularly for its potential for increasing the response rate of the host to the tumour, as these have molecular targets present in the cancerous cells and their microenvironment, thereby blocking their development. The combination of MA and CT can bring a significant increase in the rate of resectability of metastases, the progression-free survival (PFS), and the global survival (GS) in MCRC patients. OBJECTIVE To assess the effectiveness and safety of MA in the treatment of MCRC.
We present the preliminary toxicity data from the MRC COIN trial, a phase III randomised controlled trial of first-line therapy in advanced colorectal cancer, with particular reference to the addition of cetuximab to an oxaliplatin-fluoropyrimidine combination. A total of 804 patients were randomised between March 2005 and July 2006 from 78 centres throughout the United Kingdom. Patients were allocated to oxaliplatin plus fluoropyrimidine chemotherapy with or without the addition of weekly cetuximab. The choice of fluoropyrimidine (either 5-fluorouracil (5FU) or capecitabine) was decided by the treating physician and patient before randomisation. Toxicity data were collected from all patients. Two hundred and three patients received 5FU plus oxaliplatin (OxMdG, 25%), 333 oxaliplatin plus capecitabine (Xelox, 41%), 102 received OxMdG plus cetuximab (OxMdG+C, 13%) and 166 Xelox plus cetuximab (21%). Percent grade 3/4 toxicities included diarrhoea 6, 15, 13 and 25%, and nausea/vomiting 3, 7, 7 and 14% for OxMdG, Xelox, OxMdG+C and Xelox+C, respectively. Sixty-day all-cause mortality was 6, 5, 5 and 7%. Statistically significant differences were evident for patients receiving Xelox plus cetuximab vs Xelox alone: diarrhoea relative risk (RR) 1.69 (1.17, 2.43, p = 0.005) and nausea/vomiting RR 2.01 (1.16, 3.47, p = 0.012). The excess toxicity observed in the oxaliplatin-, capecitabine-, cetuximab-treated patients led the trial management group to conclude that a capecitabine dose adjustment was required to maintain safety levels when using this regimen. BACKGROUND Targeting the vascular endothelial growth factor or the epidermal growth factor receptor (EGFR) has shown efficacy in advanced colorectal cancer (ACC), but no data are available on the combination of these strategies with chemotherapy in the first-line treatment.
The CAIRO2 study evaluates the effect of adding cetuximab, a chimeric mAb against EGFR, to capecitabine, oxaliplatin and bevacizumab in the first-line treatment of ACC. PATIENTS AND METHODS In all, 755 patients were randomly assigned between treatment with capecitabine, oxaliplatin and bevacizumab with or without cetuximab. The primary end point is progression-free survival. We here present the toxicity results in the first 400 patients that entered the study. RESULTS The incidence of overall grade 3-4 toxicity was significantly higher in arm B compared with arm A (81% versus 72%, p = 0.03). This difference is fully attributed to cetuximab-related skin toxicity. The addition of cetuximab did not result in an increase of gastrointestinal toxicity or treatment-related mortality. CONCLUSIONS The addition of cetuximab to capecitabine, oxaliplatin and bevacizumab in the first-line treatment of ACC appears to be safe and feasible. No excessive or unexpected toxicity in the cetuximab-containing treatment arm was observed. Summary Background In the Medical Research Council (MRC) COIN trial, the epidermal growth factor receptor (EGFR)-targeted antibody cetuximab was added to standard chemotherapy in first-line treatment of advanced colorectal cancer with the aim of assessing the effect on overall survival. Methods In this randomised controlled trial, patients who were fit for but had not received previous chemotherapy for advanced colorectal cancer were randomly assigned to oxaliplatin and fluoropyrimidine chemotherapy (arm A), the same combination plus cetuximab (arm B), or intermittent chemotherapy (arm C). The choice of fluoropyrimidine therapy (capecitabine or infused fluorouracil plus leucovorin) was decided before randomisation. Randomisation was done centrally (via telephone) by the MRC Clinical Trials Unit using minimisation. Treatment allocation was not masked.
The comparison of arms A and C is described in a companion paper. Here, we present the comparison of arms A and B, for which the primary outcome was overall survival in patients with KRAS wild-type tumours. Analysis was by intention to treat. Further analyses with respect to NRAS, BRAF, and EGFR status were done. The trial is registered, ISRCTN27286448. Findings 1630 patients were randomly assigned to treatment groups (815 to standard therapy and 815 to addition of cetuximab). Tumour samples from 1316 (81%) patients were used for somatic molecular analyses; 565 (43%) had KRAS mutations. In patients with KRAS wild-type tumours (arm A, n = 367; arm B, n = 362), overall survival did not differ between treatment groups (median survival 17.9 months [IQR 10.3-29.2] in the control group vs 17.0 months [9.4-30.1] in the cetuximab group; HR 1.04, 95% CI 0.87-1.23, p = 0.67). Similarly, there was no effect on progression-free survival (8.6 months [IQR 5.0-12.5] in the control group vs 8.6 months [5.1-13.8] in the cetuximab group; HR 0.96, 0.82-1.12, p = 0.60). Overall response rate increased from 57% (n = 209) with chemotherapy alone to 64% (n = 232) with addition of cetuximab (p = 0.049). Grade 3 and higher skin and gastrointestinal toxic effects were increased with cetuximab (14 vs 114 and 67 vs 97 patients in the control group vs the cetuximab group with KRAS wild-type tumours, respectively). Overall survival differs by somatic mutation status irrespective of treatment received: BRAF mutant, 8.8 months (IQR 4.5-27.4); KRAS mutant, 14.4 months (8.5-24.0); all wild-type, 20.1 months (11.5-31.7). Interpretation This trial has not confirmed a benefit of addition of cetuximab to oxaliplatin-based chemotherapy in first-line treatment of patients with advanced colorectal cancer.
Cetuximab increases response rate, with no evidence of benefit in progression-free or overall survival in KRAS wild-type patients or even in patients selected by additional mutational analysis of their tumours. The use of cetuximab in combination with oxaliplatin and capecitabine in first-line chemotherapy in patients with widespread metastases cannot be recommended. Funding Cancer Research UK, Cancer Research Wales, UK Medical Research Council, Merck. BACKGROUND Assessment of the change in tumour burden is an important feature of the clinical evaluation of cancer therapeutics: both tumour shrinkage (objective response) and disease progression are useful endpoints in clinical trials. Since RECIST was published in 2000, many investigators, cooperative groups, industry and government authorities have adopted these criteria in the assessment of treatment outcomes. However, a number of questions and issues have arisen which have led to the development of a revised RECIST guideline (version 1.1). Evidence for changes, summarised in separate papers in this special issue, has come from assessment of a large data warehouse (> 6500 patients), simulation studies and literature reviews. HIGHLIGHTS OF REVISED RECIST 1.1: Major changes include: Number of lesions to be assessed: based on evidence from numerous trial databases merged into a data warehouse for analysis purposes, the number of lesions required to assess tumour burden for response determination has been reduced from a maximum of 10 to a maximum of five total (and from five to two per organ, maximum). Assessment of pathological lymph nodes is now incorporated: nodes with a short axis of ≥ 15 mm are considered measurable and assessable as target lesions. The short axis measurement should be included in the sum of lesions in calculation of tumour response. Nodes that shrink to < 10 mm short axis are considered normal.
Confirmation of response is required for trials with response as the primary endpoint but is no longer required in randomised studies, since the control arm serves as an appropriate means of interpretation of data. Disease progression is clarified in several aspects: in addition to the previous definition of progression in target disease as a 20% increase in sum, a 5 mm absolute increase is now required as well, to guard against overcalling PD when the total sum is very small. Furthermore, there is guidance offered on what constitutes 'unequivocal progression' of non-measurable/non-target disease, a source of confusion in the original RECIST guideline. Finally, a section on detection of new lesions, including the interpretation of FDG-PET scan assessment, is included. Imaging guidance: the revised RECIST includes a new imaging appendix with updated recommendations on the optimal anatomical assessment of lesions. FUTURE WORK A key question considered by the RECIST Working Group in developing RECIST 1.1 was whether it was appropriate to move from anatomic unidimensional assessment of tumour burden to either volumetric anatomical assessment or to functional assessment with PET or MRI. It was concluded that, at present, there is not sufficient standardisation or evidence to abandon anatomical assessment of tumour burden. The only exception to this is in the use of FDG-PET imaging as an adjunct to determination of progression. As is detailed in the final paper in this special issue, the use of these promising newer approaches requires appropriate clinical validation studies. BACKGROUND The growing availability of active agents makes the development of novel therapies increasingly complex and the choice of end points critical. We assessed the frequency of use of efficacy end points in advanced breast cancer. METHODS We searched PubMed for randomized trials published between 2000 and 2007 in 10 leading medical journals.
We abstracted data on progression-free survival (PFS), time to tumor progression (TTP), response rate (RR) and overall survival. RESULTS A total of 58 studies enrolled 23,371 assessable patients in 122 treatment arms. The primary end points most frequently used were RR and TTP (n = 21 each), followed by PFS (n = 14). In five of the trials using TTP as the primary end point, no definition of TTP was reported; in 13 of the other 16 cases, death was counted as an event, making TTP indistinguishable from PFS. Trials having PFS, TTP or time to treatment failure as the primary end point (n = 36) had a higher mean number of patients than those using RR (p = 0.061). CONCLUSION Investigators seem to be frequently using PFS and TTP interchangeably in advanced breast cancer. Such use of terms may lead to confusion when results of different trials are compared, and uniform use of definitions seems in order. PURPOSE To compare the efficacy of cediranib (a vascular endothelial growth factor receptor tyrosine kinase inhibitor [VEGFR TKI]) with that of bevacizumab (anti-VEGF-A monoclonal antibody) in combination with chemotherapy as first-line treatment for advanced metastatic colorectal cancer (mCRC). PATIENTS AND METHODS HORIZON III [Cediranib Plus FOLFOX6 Versus Bevacizumab Plus FOLFOX6 in Patients With Untreated Metastatic Colorectal Cancer] had an adaptive phase II/III design. Patients randomly assigned 1:1:1 received mFOLFOX6 [oxaliplatin 85 mg/m(2) and leucovorin 400 mg/m(2) intravenously, followed by fluorouracil 400 mg/m(2) intravenously on day 1 and then continuous infusion of 2,400 mg/m(2) over the next 46 hours every 2 weeks] with cediranib (20 or 30 mg per day) or bevacizumab (5 mg/kg every 14 days).
An independent end-of-phase II analysis concluded that mFOLFOX6/cediranib 20 mg met predefined criteria for continuation; subsequent patients received mFOLFOX6/cediranib 20 mg or mFOLFOX6/bevacizumab (randomly assigned 1:1). The primary objective was to compare progression-free survival (PFS). RESULTS In all, 1,422 patients received mFOLFOX6/cediranib 20 mg (n = 709) or mFOLFOX6/bevacizumab (n = 713). Primary analysis revealed no significant difference between arms for PFS (hazard ratio [HR], 1.10; 95% CI, 0.97 to 1.25; p = .119), overall survival (OS; HR, 0.95; 95% CI, 0.82 to 1.10; p = .541), or overall response rate (46.3% v 47.3%). Median PFS and OS were 9.9 and 22.8 months for mFOLFOX6/cediranib and 10.3 and 21.3 months for mFOLFOX6/bevacizumab. The PFS upper 95% CI was outside the predefined noninferiority limit (HR < 1.2). Common adverse events with more than 5% incidence in the cediranib arm included diarrhea, neutropenia, and hypertension. Cediranib-treated patients completed fewer chemotherapy cycles than bevacizumab-treated patients (median 10 v 12 cycles). Patient-reported outcomes (PROs) were significantly less favorable in cediranib-treated versus bevacizumab-treated patients (p < .001). CONCLUSION Cediranib activity, in terms of PFS and OS, was comparable to that of bevacizumab when added to mFOLFOX6; however, the predefined boundary for PFS noninferiority was not met. The cediranib safety profile was consistent with previous studies but led to less favorable PROs compared with bevacizumab. Investigation of oral TKIs in CRC continues. Objective: The objective of this phase III trial was to compare chemotherapy combined with bevacizumab versus chemotherapy alone in the treatment of patients with advanced colorectal cancer.
Methods: From September 2004 till September 2008, 222 treatment-naive patients were enrolled and divided into 2 arms: 114 arm A patients were treated with leucovorin and 5-fluorouracil plus irinotecan in combination with bevacizumab, and 108 arm B patients were treated as above without bevacizumab. All patients were stage IV with histologically confirmed adenocarcinoma. Results: The median overall survival of arm A patients was 22.0 months (95% CI: 18.1-25.9) and 25.0 months (CI: 18.1-31.9) for arm B patients. There was no statistically significant difference between the 2 arms (p = 0.1391). No statistically significant difference between the 2 arms regarding the response rate was observed: partial response, 42 patients (36.8%) and 38 patients (35.2%) for arms A and B, respectively. Hematologic toxicity did not differ in the comparison of the 2 arms. Nonhematologic toxicity in arm A involved hypertension in 23 (20.2%) of the patients and proteinuria in 7 (6.1%); 3 patients experienced hemorrhage and 1 patient intestinal perforation. None of these side effects was observed in arm B patients. Conclusion: No statistically significant difference in median overall survival was observed between patients with advanced colorectal cancer treated with bevacizumab plus a combination therapy (arm A) and those treated with the combination only, without bevacizumab (arm B). This is a phase II institutional exploratory trial of a biweekly irinotecan and cetuximab administration regimen in metastatic colorectal cancer patients progressing after at least one previous chemotherapy line. A total of 40 patients were treated between November 2005 and November 2007 with irinotecan 180 mg m−2 and cetuximab 500 mg m−2 q2w (every 2 weeks), in 21-day cycles, until unacceptable toxicity or progressive disease. An overall response rate of 22.5% was obtained (two complete and seven partial responses). The disease control rate was 60%.
The time to progression was 3.4 months and the overall survival was 8 months . The toxicity compared very favourably to weekly cetuximab combination schedules . Grade 3/4 adverse effects were observed in 12 patients . Overall , our results turn out very similar both in terms of toxicity and efficacy to those obtained by weekly and biweekly administration regimens . AIM To investigate the efficacy and safety of capecitabine plus irinotecan + /- bevacizumab in advanced or metastatic colorectal cancer patients . METHODS Forty-six patients with previously untreated , locally-advanced or metastatic colorectal cancer ( mCRC ) were recruited between 2001 - 2006 in a prospective open-label phase II trial , in German community-based outpatient clinics . Patients received a standard capecitabine plus irinotecan ( CAPIRI ) or CAPIRI plus bevacizumab ( CAPIRI-BEV ) regimen every 3 wk . Dose reductions were mandatory from the first cycle in cases of > grade 2 toxicity . The treatment choice of bevacizumab was at the discretion of the physician . The primary endpoints were response and toxicity and secondary endpoints included progression-free survival and overall survival . RESULTS In the CAPIRI group vs the CAPIRI-Bev group there were more female than male patients ( 47 % vs 24 % ) , and more patients had colon as the primary tumor site ( 58.8 % vs 48.2 % ) with fewer patients having sigmoid colon as primary tumor site ( 5.9 % vs 20.7 % ) . Grade 3/4 toxicity was higher with CAPIRI than CAPIRI-Bev : 82 % vs 58.6 % . Partial response rates were 29.4 % and 34.5 % , and tumor control rates were 70.6 % and 75.9 % , respectively . No complete responses were observed . The median progression-free survival was 11.4 mo and 12.8 mo for CAPIRI and CAPIRI-Bev , respectively . The median overall survival for CAPIRI was 15 mo ( 458 d ) and for CAPIRI-Bev 24 mo ( 733 d ) . These differences were not statistically different . 
In the CAPIRI-Bev group , two patients underwent a full secondary tumor resection after treatment , whereas in the CAPIRI group no cases underwent this procedure . CONCLUSION Both regimens were well tolerated and offered effective tumor growth control in this outpatient setting . Severe gastrointestinal toxicities and thromboembolic events were rare and if observed were never fatal . Objectives : FOLFOX-4 and FOLFIRI are considered equivalent in terms of activity and efficacy as first-line chemotherapy in metastatic colorectal cancer ( mCRC ) . The monoclonal antibody ( mAb ) cetuximab showed intrinsic activity as a single agent in mCRC and was approved in combination with CPT-11 for patients who failed previous CPT-11-based treatment . The purpose of this phase II study was to evaluate the activity and safety of FOLFOX-4 plus cetuximab in untreated mCRC patients . Methods : Untreated patients with measurable metastatic disease and expressing epidermal growth factor receptor ( EGFR ) received cetuximab at a loading dose of 400 mg/m2 , followed by weekly doses of 250 mg/m2 , in combination with the FOLFOX-4 regimen every 2 weeks for a maximum of 12 cycles , after which a maintenance program using cetuximab alone was allowed for a maximum of 6 months . Results : Eighty-two unselected patients were screened ; 70 were EGFR+ and entered the trial . Of the 67 assessable patients , the objective response rate was 64.2 % ( 95 % CI : 52.5–75.5 % ) and the tumor growth control rate was 94 % ( 95 % CI : 88–99 % ) . All the objective responses except 1 were confirmed . In the group of patients with initially unresectable liver disease alone , 7/33 ( 21 % ) were resected . The median time to progression ( TTP ) and overall survival ( OS ) were 10.0 and 22.0 months , respectively . The treatment was well tolerated , with no treatment-related deaths , while 24.2 % of the patients were affected by cutaneous toxicity of grade > 2 . 
Mutational analysis of the KRAS and BRAF genes was retrospectively performed on 35 of the 69 patients treated with cetuximab ( 51 % ) . KRAS was mutated in 13 out of the 35 cases ( 37 % ) , whereas no mutations were detected in the BRAF gene . A trend toward an association between KRAS mutations and objective response to treatment ( p = 0.07 ) was demonstrated . Analysis of survival showed that patients harboring KRAS mutations had a trend toward worst TTP ( p = 0.14 ) confirmed by age- and sex-adjusted Cox multivariate regression ( hazard ratio , HR = 0.62 ; 95 % CI : 0.36–1.06 ; p = 0.08 ) . Indeed , KRAS mutations were significantly associated with worst OS in both unadjusted analysis ( p = 0.047 ; log rank test ) and age- and sex-adjusted Cox multivariate regression ( HR = 0.458 ; 95 % CI : 0.248–0.847 ; p = 0.01 ) . Conclusions : These results suggest that the combination of FOLFOX-4 plus cetuximab is very active and obtains long TTP with an acceptable toxicity profile . Indeed , our results are in line with recent findings from phase II and phase III randomized studies providing strong evidence that the efficacy of anti-EGFR mAb is confined to patients with wild-type KRAS mCRC . Investigation of other predictive biomarkers may be useful to further define the responder population . 3531 Background : The EXPLORE study is a randomized phase III study comparing cetuximab ( Erbitux ) plus FOLFOX4 ( 5-FU , leucovorin , oxaliplatin ) , to FOLFOX4 in 2nd-line metastatic , EGFR-positive colorectal cancer ( CRC ) patients ( pts ) . Since this study is the first experience combining cetuximab with FOLFOX4 , an early safety assessment was performed . METHODS The first stage of this randomized phase III trial restricted accrual to 40 pts at 10 centers to obtain an early assessment of safety . 
Pts with metastatic EGFR-positive CRC who had progressed on prior first-line irinotecan therapy with an ECOG performance status < 2 were randomized to either Arm A ( cetuximab 400 mg/m2 initial dose followed by 250 mg/m2 weekly dose and FOLFOX4 d1,d2 q 2 weeks ) or to Arm B ( FOLFOX4 q 2 weeks ) . A pooled analysis of safety is presented . RESULTS Forty patients were randomized from March to November , 2003 . They included 20 women and 20 men with a mean age of 61 years . Two patients did not receive study therapy and are excluded from the analysis . A total of 178 cycles were administered to the 38 pts with a median of 3 cycles per pt ( range 1 - 15 ) . There have been 4 disease-related deaths and one severe non-fatal hypersensitivity reaction ( HSR ) . A summary table of adverse events associated with chemotherapy and cetuximab is below . The incidence and severity of these events appear comparable to previous reports of cetuximab with chemotherapy or of FOLFOX4 . CONCLUSIONS In this pooled analysis the characteristic toxicities of cetuximab and FOLFOX4 do not appear to be increased . The study is continuing to accrue to its target of 1100 patients . [ Figure : see text ] [ Table : see text ] The authors explored the association of skin toxicity ( ST ) severity as measured by patient‐reported ST and Common Terminology Criteria for Adverse Events ( CTCAE ) grading with efficacy of panitumumab , a fully human antiepidermal growth factor receptor antibody , from a phase 3 metastatic colorectal cancer ( CRC ) trial . CONTEXT Leucovorin , fluorouracil , and oxaliplatin ( FOLFOX ) is the standard adjuvant therapy for resected stage III colon cancer . Adding cetuximab to FOLFOX benefits patients with metastatic wild-type KRAS but not mutated KRAS colon cancer . OBJECTIVE To assess the potential benefit of cetuximab added to the modified sixth version of the FOLFOX regimen ( mFOLFOX6 ) in patients with resected stage III wild-type KRAS colon cancer . 
DESIGN , SETTING , AND PARTICIPANTS A randomized trial of 2686 patients aged 18 years or older at multiple institutions across North America enrolled following resection and informed consent between February 10 , 2004 , and November 25 , 2009 . The primary randomized comparison was 12 biweekly cycles of mFOLFOX6 with and without cetuximab . KRAS mutation status was centrally determined . The trial was halted after a planned interim analysis of 48 % of predicted events ( 246/515 ) occurring in 1863 ( of 2070 planned ) patients with tumors having wild-type KRAS . A total of 717 patients with mutated KRAS and 106 with indeterminate KRAS were accrued . The 2070 patients with wild-type KRAS provided 90 % power to detect a hazard ratio ( HR ) of 1.33 ( 2-sided α = .05 ) , with planned interim efficacy analyses after 25 % , 50 % , and 75 % of expected relapses . MAIN OUTCOME MEASURES Disease-free survival in patients with wild-type KRAS mutations . Secondary end points included overall survival and toxicity . RESULTS Median ( range ) follow-up was 28 ( 0 - 68 ) months . The trial demonstrated no benefit when adding cetuximab . Three-year disease-free survival for mFOLFOX6 alone was 74.6 % vs 71.5 % with the addition of cetuximab ( HR , 1.21 ; 95 % CI , 0.98 - 1.49 ; P = .08 ) in patients with wild-type KRAS , and 67.1 % vs 65.0 % ( HR , 1.12 ; 95 % CI , 0.86 - 1.46 ; P = .38 ) in patients with mutated KRAS , with no significant benefit in any subgroups assessed . Among all patients , grade 3 or higher adverse events ( 72.5 % vs 52.3 % ; odds ratio [ OR ] , 2.4 ; 95 % CI , 2.1 - 2.8 ; P < .001 ) and failure to complete 12 cycles ( 33 % vs 23 % ; OR , 1.6 ; 95 % CI , 1.4 - 1.9 ; P < .001 ) were significantly higher with cetuximab . Increased toxicity and greater detrimental differences in all outcomes were observed in patients aged 70 years or older . 
CONCLUSION Among patients with stage III resected colon cancer , the use of cetuximab with adjuvant mFOLFOX6 compared with mFOLFOX6 alone did not result in improved disease-free survival . TRIAL REGISTRATION ClinicalTrials.gov Identifier : NCT00079274 . Summary Purpose To assess safety and efficacy of folinic acid , 5-fluorouracil , gemcitabine ( FFG ) and folinic acid , fluorouracil , oxaliplatin ( FOLFOX4 ) regimens with added bevacizumab as first-line treatment in patients with advanced colorectal cancer ( CRC ) . Patients and Methods Patients with Stage III unresectable or Stage IV adenocarcinoma of the colon or rectum were randomly assigned to either FFG weekly for 6 weeks of an 8-week cycle or FOLFOX4 every 2 weeks . After FDA approval , bevacizumab 5 mg/kg was added every 2 weeks . Treatment continued until disease progression . Planned enrollment was 190 patients . Primary endpoint was overall response rate ( ORR ) ; secondary endpoints included evaluation of adverse events , time to progression ( TTP ) , and overall survival ( OS ) . Disease Control Rate ( DCR ; % of patients with complete or partial responses or stable disease ) was a post hoc analysis . Results The trial was stopped prematurely due to low enrollment . Of 84 enrolled patients ( 42 to each arm ) , 36 patients ( 18 in each arm ) received bevacizumab . ORR was greater ( P = .002 ) for FOLFOX4 ( 17/42 ; 40.5 % ) than for FFG ( 4/42 ; 9.5 % ) ; however , TTP , OS , and DCR results were not statistically different comparing FOLFOX4 and FFG . Peripheral neuropathy was more frequent ( P = < .001 ) with FOLFOX4 ( 18/42 ; 42.9 % ) than with FFG ( 1/42 ; 2.4 % ) . Conclusions FFG and FOLFOX4 were generally well tolerated . Based on ORR , FOLFOX4 was superior to FFG . However , differences in TTP and OS comparing regimens were inconclusive . 
General use of gemcitabine as a biomodulator of 5-fluorouracil in CRC cannot be recommended at this time and the regimen remains investigational . PURPOSE We evaluated the safety and efficacy of concurrent administration of two monoclonal antibodies , cetuximab and bevacizumab , in patients with metastatic colorectal cancer . PATIENTS AND METHODS This was a randomized phase II study in patients with irinotecan-refractory colorectal cancer . All patients were naïve to both bevacizumab and cetuximab . Patients in arm A received irinotecan at the same dose and schedule as last received before study entry , plus cetuximab 400 mg/m2 loading dose , then weekly cetuximab 250 mg/m2 , plus bevacizumab 5 mg/kg administered every other week . Patients in arm B received the same cetuximab and bevacizumab as those in arm A but without irinotecan . RESULTS Forty-three patients received cetuximab , bevacizumab , and irinotecan ( CBI ) and 40 patients received cetuximab and bevacizumab alone ( CB ) . Toxicities were as would have been expected from the single agents . For the CBI arm , time to tumor progression ( TTP ) was 7.3 months and the response rate was 37 % ; for the CB arm , TTP was 4.9 months and the response rate was 20 % . The overall survival for the CBI arm was 14.5 months and the overall survival for the CB-alone arm was 11.4 months . CONCLUSION Cetuximab and bevacizumab can be administered concurrently , with a toxicity pattern that seems to be similar to that which would be expected from the two agents alone . This combination plus irinotecan also seems to be feasible . The activity seen with the addition of bevacizumab to cetuximab , or to cetuximab plus irinotecan , seems to be favorable when compared with historical controls of cetuximab or cetuximab/irinotecan in patients who are naïve to bevacizumab . BACKGROUND To date , there is no screening programme for colorectal cancer ( CRC ) in Flanders , Belgium . 
However , The European Code Against Cancer ( 2003 ) recommends a population -based approach for CRC screening . This study aimed to obtain information about potential participation rates for a population -based screening programme for CRC in Flanders , and to compare two invitation strategies . METHODS In 2009 , a trial programme for CRC screening was set up in three Flemish areas for all average-risk people aged 50 - 74 years , using an immunochemical faecal occult blood test ( iFOBT ) with a cut-off value set at 75 ng/ml of haemoglobin . The faecal sampling set was sent at random by post ( mail group ) or provided by the general practitioner ( GP group ) . RESULTS In total , 19,542 people were invited to participate . Of these , 8229 provided a faecal sample , resulting in an overall participation rate of 42.1 % . Participation by mail and through the GP was 52.3 % ( 95 % CI , 51.3 - 53.2 ) and 27.7 % ( 95 % CI , 26.7 - 28.6 ) , respectively . The difference of 24.6 % was statistically significant ( 95 % CI , 23.3 - 25.9 , p<0.001 ) . Before the reminder letter was sent and the other invitation strategy was offered , the overall participation rate was 26.5 % ( n=5176 ) ; 36.4 % ( 95 % CI , 35.5 - 37.4 ) for the mail group and 16.6 % ( 95 % CI , 15.8 - 17.3 ) for the GP group . The odds of participating in CRC screening was almost three times higher for people invited by mail as opposed to people invited through a GP ( OR=2.96 , 95 % CI , 2.78 - 3.14 , p<0.001 ) . Women were more likely to participate in CRC screening than men ( OR=1.22 , 95 % CI , 1.15 - 1.30 , p<0.001 ) . In addition , we found that inhabitants from residential ( OR=1.98 , 95 % CI , 1.85 - 2.11 ) and rural ( OR=2.90 , 95 % CI , 2.66 - 3.16 ) areas were more likely to participate than those in urban areas . Of the 8229 people who submitted a faecal sample , 435 ( 5.3 % ) had a positive iFOBT , and of those , CRC was diagnosed in 18 ( 5.7 % ) individuals . 
Compliance for follow-up colonoscopy was 72.9 % , and did not differ between the mail ( 72.4 % , 95 % CI , 67.5 - 77.3 ) and GP groups ( 74.3 % , 95 % CI , 66.2 - 82.5 ) . CONCLUSION Inviting people for CRC screening by means of a direct-mail invitation , and including a faecal sampling set ( iFOBT ) , results in much higher participation rates than inviting people through the GP . BACKGROUND Irinotecan and weekly cetuximab ( I+C ) is a standard second-line regimen for metastatic colorectal cancer ( mCRC ) . This study investigated the safety and efficacy of every 2 weeks I+C in patients with mCRC . PATIENTS AND METHODS Patients with mCRC refractory to first-line fluoropyrimidine/oxaliplatin regimens and not previously treated with I+C were eligible . Response rate ( RR ) was the primary endpoint . Cetuximab 500 mg/m(2 ) and irinotecan 180 mg/m(2 ) were administered intravenously ( I.V. ) on day 1 every 2 weeks . RESULTS Patient characteristics ( n = 31 ) : male ( n = 17 ) , median age 62 ; Eastern Cooperative Oncology Group ( ECOG ) performance status ( PS ) ≤1 ( n = 30 ) , and PS = 2 ( n = 1 ) . Median number of cycles = 3 ( range , 1 - 22 ) . I+C doses were modified in 18 and 12 patients , respectively . Grade 3/4 adverse events : acneiform rash ( n = 6 ) ; neutropenia ( n = 6 ) ; and diarrhea ( n = 5 ) ; there was one grade 5 respiratory failure , possibly related to therapy . Two patients had a partial response , 11 had stable disease , and 18 had progressive disease resulting in an overall RR of 6 % and disease control rate of 41.9 % . Median overall survival ( OS ) was 9.3 months ( 95 % CI , 5.1 - 15 ) , and time to progression ( TTP ) was 2.4 months ( 95 % CI , 1.3 - 4.6 ) . K-ras and BRAF mutations were detected in 39 % and 9 % , respectively , of the patients tested . There was a trend toward longer TTP among patients with wild-type K-ras and BRAF ( 2.6 vs. 1.7 months ; P = 0.16 ) , and OS was significantly longer in those patients ( 14.1 vs. 
5.5 months ; P = 0.04 ) . CONCLUSIONS The RR and TTP were lower than expected and may reflect the reduced dose intensity due to toxicities . While the OS was consistent with previous publications , the efficacy of this combination was not demonstrated . The standard adjuvant treatment for stage III colon cancer in Europe is the 5-fluorouracil , leucovorin and oxaliplatin ( FOLFOX-4 ) regimen , given for 6 months . Cetuximab , a monoclonal antibody directed against the EGF receptor , appears to be effective and safe when combined with oxaliplatin-based regimens , including FOLFOX-4 , in patients with metastatic colorectal cancer . PETACC-8 , a randomized , multicenter , European Phase III trial , is comparing the efficacy of cetuximab plus FOLFOX-4 with that of FOLFOX-4 alone in patients with stage III colon cancer . The study began in December 2005 and approximately 2000 patients are to be enrolled in nine European countries . The primary end point is disease-free survival time , analyzed after a minimum follow-up of 3 years per patient . Secondary end points include overall survival , treatment compliance , safety and pharmacogenomic parameters . PURPOSE To evaluate the antitumor activity and toxicity of single-agent cetuximab in patients with chemotherapy-refractory colorectal cancer whose tumors express the epidermal growth factor receptor . PATIENTS AND METHODS Phase II , open-label clinical trial . Patients were required to have EGFr expression demonstrated on formalin-fixed paraffin-embedded tumor tissue by immunohistochemical staining before study participation . Patients were required to have received irinotecan , either alone or in a combination regimen , and to have demonstrated clinical failure on this regimen before study entry . Cetuximab was administered weekly by intravenous infusion . The first dose of 400 mg/m(2 ) was given during the course of 2 hours . Subsequent weekly treatments were given at a dose of 250 mg/m(2 ) during the course of 1 hour . 
RESULTS Fifty-seven eligible patients were treated . All were assessable for toxicity and response . The most commonly encountered grade 3 to 4 adverse events , regardless of relationship to study drug , were an acne-like skin rash , predominantly on the face and upper torso ( 86 % with any grade ; 18 % with grade 3 ) , and a composite of asthenia , fatigue , malaise , or lethargy ( 56 % with any grade , 9 % with grade 3 ) . Two patients ( 3.5 % ) experienced grade 3 allergic reactions requiring discontinuation of study treatment . A third patient experienced a grade 3 allergic reaction that resolved , and the patient continued on the study . Neither diarrhea nor neutropenia were dose limiting in any of the 57 patients treated . Five patients ( 9 % ; 95 % CI , 3 % to 19 % ) achieved a partial response . Twenty-one additional patients had stable disease or minor responses . The median survival in these previously treated patients with chemotherapy-refractory colorectal cancer is 6.4 months . CONCLUSION Cetuximab on this once-weekly schedule has modest activity and is well-tolerated as a single agent in patients with chemotherapy-refractory colorectal cancer whose tumors express the epidermal growth factor receptor . Further studies of cetuximab will evaluate the use of cetuximab in conjunction with first-line and adjuvant treatments for this disease . BACKGROUND Cetuximab ( C ) , alone or with irinotecan , demonstrates activity in irinotecan-refractory colorectal cancer ( CRC ) . Activity of 5-fluorouracil ( 5-FU ) , leucovorin ( L ) , and bevacizumab ( B ) , and preliminary data of cetuximab + bevacizumab , and toxicity profiles suggests that FOLF-CB ( 5-FU , L , C+B ) may have activity with a favorable toxicity profile as first-line therapy . METHODS Eligible patients were randomized at registration to either arm A ( mFOLFOX6-B ) ( modified 5-FU , 
L ( folinic acid ) , oxaliplatin ( O ) + bevacizumab ) , administered days 1 and 15 of each 28-day cycle as bevacizumab 5 mg/kg , oxaliplatin 85 mg/m(2 ) , leucovorin 400 mg/m(2 ) , and 5-FU 400 mg/m(2 ) then 1200 mg/m(2)/day for 48 hours , or arm B ( FOLF-CB ) , which included bevacizumab , leucovorin , and 5-FU as in arm A and cetuximab 400 mg/m(2 ) day 1 cycle 1 ; all other weekly cetuximab doses were 250 mg/m(2 ) . RESULTS Two hundred forty-seven patients ( arm A/arm B 124/123 ) were enrolled , and 239 were treated ( 118/121 ) . Twelve-month progression-free survival ( PFS ) was 45%/32 % , objective response rates ( ORR ) ( complete response [ CR ] + partial response [ PR ] ) were 52%/41 % , disease control rates ( CR+PR+stable disease [ SD ] ) were 87%/83 % , and median overall survival ( OS ) was 21/19.5 months , respectively . Grade 3 - 4 neutropenia was higher in arm A ( 28%/7 % ) , as was grade 3 fatigue ( 12%/3 % ) , and grade 3 neuropathy ( 11%/ < 1 % ) , whereas acneiform rash was confined to arm B. Retrospective analysis of KRAS mutational status did not demonstrate KRAS as a meaningful determinant of activity , except in arm B patients with KRAS-mutated tumors , which resulted in inferior PFS . Patient satisfaction favored the control ( mFOLFOX6-B ) . CONCLUSION FOLF-CB was not superior to mFOLFOX6-B in terms of 12-month PFS and ORR , and was not more acceptable to patients . This trial supports the conclusion of other recently reported trials that concurrent cetuximab+bevacizumab should not be routinely used in metastatic CRC
Conclusion INCS is effective for CRS . Prior sinus surgery and direct sinus delivery enhance the effectiveness of INCS in CRS
Background Published randomized controlled trials ( RCTs ) on the efficacy of intranasal corticosteroid ( INCS ) in chronic rhinosinusitis ( CRS ) use either nasal delivery ( nasal drop or nasal spray ) or sinus delivery ( sinus catheter or sinus irrigation ) in patients with or without sinus surgery . This influences topical drug delivery and distribution . The effect of these factors on the published results of RCTs is assessed . This systematic review explores the strength of evidence supporting the influence of sinus surgery and delivery methods on the effectiveness of topical steroids in studies for CRS with meta-analyses .
BACKGROUND Topical glucocorticoids are the medical treatment of choice in a majority of patients suffering from nasal polyposis . Fluticasone propionate is a fluorinated steroid reported to be highly effective when used topically in the nose for seasonal and perennial allergic and nonallergic rhinitis . OBJECTIVES To evaluate the efficacy and tolerability of intranasal fluticasone propionate in the treatment of long-standing polyposis . METHODS Fifty-five patients with long-standing nasal polyposis were treated over a 26-week period with fluticasone propionate aqueous nasal spray 200 micrograms bid , beclomethasone dipropionate aqueous nasal spray 200 micrograms bid or placebo , administered intranasally in an aqueous spray in a double-blind , placebo-controlled parallel-group design at a single center . The primary efficacy endpoint was the physicians ' assessment of symptoms and polyp score . Peak nasal inspiratory flow was performed twice daily and on every visit to evaluate the effect of the corticosteroids on nasal air flow . RESULTS A significant difference in the primary efficacy endpoint between fluticasone propionate aqueous nasal spray and beclomethasone dipropionate aqueous nasal spray compared with placebo was seen after 14 weeks of treatment . This was further verified by the peak nasal inspiratory flow results . There was some evidence of earlier onset in the fluticasone propionate aqueous nasal spray group compared with the beclomethasone dipropionate aqueous nasal spray group after 4 weeks in terms of the primary efficacy endpoint . From the daily record cards patients receiving fluticasone propionate aqueous nasal spray had a significantly higher percentage of days on which they required no rescue medication ( P < .009 ) and a higher percentage of days with an overall nasal blockage score on waking of < 2 ( P < .013 ) when compared with placebo-treated patients . 
No other statistically significant results were found between the two active compounds . CONCLUSION Fluticasone propionate aqueous nasal spray 200 micrograms bid and beclomethasone dipropionate aqueous nasal spray 200 micrograms bid are effective in treating the symptoms of nasal polyps , with some evidence that fluticasone propionate aqueous nasal spray has a faster onset of action and is tolerated at least as well as beclomethasone dipropionate aqueous nasal spray at the same dose . BACKGROUND : Topical nasal steroids such as beclomethasone dipropionate and fluticasone propionate have been widely used in the treatment of rhinitis and polyposis . An increase in infection has occurred with the use of fluticasone propionate after endoscopic polypectomy . OBJECTIVE : The purpose of this study was to determine the prevalence of nasal and paranasal infections with the use of topical nasal steroids after endoscopic polypectomy and to compare the recurrence rates of the polyposis . DESIGN AND SETTING : We conducted a prospective , comparative , open , experimental , longitudinal study at an academic tertiary referral medical center . METHODS : One hundred sixty-two patients in whom endoscopic polypectomy had been indicated were randomly divided into 3 groups of 54 patients each . The patients from the first group were treated with saline lavage only . Patients from the second group also received fluticasone propionate 400 μg/day in nasal spray after lavage . Patients from the third group received beclomethasone dipropionate 600 μg/day after lavage . The prevalence of infections and recurrence of polyposis was compared in the 3 groups . RESULTS : Three patients , 2 in the placebo group and 1 in the beclomethasone group , developed infections during the first 3 months after surgical procedure . The recurrence of polyps in the group without steroids was 44 % . 
In contrast , 15 % of the patients treated with fluticasone showed recurrence of polyposis ; furthermore , 26 % of the patients treated with beclomethasone showed recurrence of polyposis , with a minimum follow-up of 12 months . CONCLUSIONS : The use of nasal steroids does not seem to increase the prevalence of infections after endoscopic polypectomy . ( Otolaryngol Head Neck Surg 2004;130:319–22 . ) BACKGROUND There is little scientific evidence to support the current practice of using oral glucocorticosteroids and antibiotics to treat patients with chronic rhinosinusitis and nasal polyps . OBJECTIVE We evaluated the effects of oral glucocorticoids and doxycycline on symptoms and objective clinical and biological parameters in patients with chronic rhinosinusitis and nasal polyps . METHODS In a double-blind , placebo-controlled , multicenter trial , we randomly assigned 47 participants with bilateral nasal polyps to receive either methylprednisolone in decreasing doses ( 32 - 8 mg once daily ) , doxycycline ( 200 mg on the first day , followed by 100 mg once daily ) , or placebo for 20 days . Participants were followed for 12 weeks . Patients were assessed for nasal peak inspiratory flow and symptoms and by nasal endoscopy . Markers of inflammation such as eosinophilic cationic protein ( ECP ) , IL-5 , myeloperoxidase , matrix metalloproteinase 9 , and IgE were measured in nasal secretions . Concentrations of eosinophils , ECP , and soluble IL-5 receptor alpha were measured in peripheral blood samples . RESULTS Methylprednisolone and doxycycline each significantly decreased nasal polyp size compared with placebo . The effect of methylprednisolone was maximal at week 3 and lasted until week 8 , whereas the effect of doxycycline was moderate but present for 12 weeks . 
Methylprednisolone significantly reduced levels of ECP , IL-5 , and IgE in nasal secretions , whereas doxycycline significantly reduced levels of myeloperoxidase , ECP , and matrix metalloproteinase 9 in nasal secretions . CONCLUSION This is the first double-blind , placebo-controlled study to show a significant effect of oral methylprednisolone and doxycycline on size of nasal polyps , nasal symptoms , and mucosal and systemic markers of inflammation . This double-blind parallel-group study compared the effect of budesonide with placebo , in the prophylaxis of nasal polyp recurrence after evulsion . Seventy-three patients with first time or recurrent polypectomy were enrolled . At revisits 3 and 6 months after evulsion , the budesonide-treated patients had significantly lower polyp scores than the placebo-treated patients . Only patients with recurrent nasal polyposis benefited from the budesonide treatment , whereas no effect was evident in patients with first time evulsion . 60 patients , aged 15 - 51 years , with chronic allergic or bacterial maxillary sinusitis , were entered in a controlled , double-blind study comparing the efficacy of endonasal irrigations of tixocortol pivalate (Pivalone)-neomycin and neomycin . The treatment lasted 11 days and was administered once daily . A ventilometric measurement of sinus pressure was performed every two endonasal irrigations to assess treatment efficacy . The percentage of nasal deobstruction was significantly higher with tixocortol pivalate-neomycin than with neomycin alone by the fifth examination ( 9th day ) regardless of the etiology of the sinusitis ( allergic or bacterial ) . 
After 11 days of treatment , significantly better results were obtained in cases of bacterial sinusitis ( 94 % deobstruction with tixocortol pivalate-neomycin versus 74 % with neomycin ) than in cases of allergic sinusitis ( 69 % deobstruction with tixocortol pivalate-neomycin versus 36 % with neomycin ) . OBJECTIVES To investigate the effect of intranasal corticosteroids in the treatment of polyps in patients with severe polyposis listed for surgical treatment and to determine the treatment effect on the progression of the disease . DESIGN A double-blind , randomized , parallel-group , placebo-controlled , 12-week study at a single center . SETTING A tertiary referral center in London , England . PATIENTS Thirty-four patients with severe polyposis listed for endoscopic surgical treatment . INTERVENTION By random allocation , fluticasone propionate aqueous nasal spray ( FPANS ) , 200 microg twice a day ; beclomethasone dipropionate aqueous nasal spray , 200 microg twice a day ; or placebo nasal spray twice a day was administered . Patients received 2 actuations to each nostril in the morning and in the evening . MAIN OUTCOME MEASURES Efficacy end points were the need for polypectomy at the end of treatment , the results of acoustic rhinometry , the polyp score , the peak nasal inspiratory flow rate , and an assessment of symptoms . RESULTS The polyp score was significantly decreased in the FPANS-treated group ( P < or = .01 ) . The nasal cavity volume was significantly increased in both the FPANS-treated group and the group receiving beclomethasone compared with placebo ( P < or = .01 ) at the end of treatment . The percentage change in the mean morning peak nasal inspiratory flow rate was greater in the FPANS-treated group , with a significant effect observed at week 2 ( P = .01 ) . Nasal blockage was significantly decreased in both active groups compared with the group receiving placebo . 
No significant difference was observed between the treatment groups in the number of patients requiring polypectomy. CONCLUSIONS Fluticasone and beclomethasone aqueous nasal sprays are effective in treating the symptoms of severe nasal polyps. There was some evidence that the group treated with FPANS responded more quickly to intervention and that the magnitude of the response was greater than in the group receiving beclomethasone.

Background The failure rate for frontal sinusotomy is higher than that of overall endoscopic sinus surgery (ESS). To prevent frontal sinus obstruction, systemic or topical steroids are commonly used, but systemic steroid therapy can cause significant morbidity and topical sprays cannot be distributed to the frontal ostium. This study was designed to determine the efficacy of anatomically directed topical steroid drops in reducing frontal ostium stenosis compared with topical steroid sprays after ESS. Methods A prospective, randomized, single-blind study was conducted in 43 patients (77 nasal cavities) who had undergone ESS, including frontal sinusotomy. Twenty-one patients (39 nasal cavities) used steroid drops applied with the Mygind technique, and 22 patients (38 nasal cavities) used steroid sprays for 8 weeks postoperatively. The patency of the frontal ostium was evaluated endoscopically 3 months postoperatively. Results The study included 29 men and 14 women (mean age, 48.2 years; range, 19–62 years). Endoscopic scores in terms of polypoid change, edema, and scar in the middle meatus and frontal recess were not significantly different between the groups, although the drop group showed a tendency to superior scores when compared with the spray group (p > 0.05). The frontal sinus patency of the drop group was significantly higher than that of the spray group (p < 0.05).
Conclusion Topical steroid drops using the Mygind technique led to a 16% improvement in frontal sinus patency rates 3 months after ESS in this study compared with postoperative topical steroid spray use.

OBJECTIVE To assess the efficacy and safety of fluticasone propionate administered using OptiNose's novel delivery device (Opt-FP) in subjects with bilateral mild-to-moderate nasal polyposis. METHODS A prospective, multicentre, randomized, double-blind, placebo-controlled, parallel-group study was conducted in adult subjects (n = 109) with mild-to-moderate bilateral nasal polyposis. Subjects received Opt-FP 400 microg or placebo twice daily for 12 weeks. Endpoints included endoscopic assessment of polyp size using Lildholdt's Scale, peak nasal inspiratory flow (PNIF), symptom scores and use of rescue medication. RESULTS The proportion of subjects with improvement in summed polyp score ≥ 1 (Lildholdt's Scale) was significantly higher with Opt-FP compared with placebo at 4, 8 and 12 weeks (22% vs 7%, p = 0.011; 43% vs 7%, p < 0.001; 57% vs 9%, p < 0.001). After 12 weeks the summed polyp score was reduced by 35% (-0.98 vs +0.23, p < 0.001). PNIF increased progressively during Opt-FP treatment (p < 0.05). Combined symptom score, nasal blockage, discomfort, rhinitis symptoms and sense of smell were all significantly improved. Rescue medication use was lower (3.1% vs 22.4%, p < 0.001). Opt-FP was well tolerated. CONCLUSIONS Fluticasone propionate (400 microg b.i.d.
) administered using OptiNose's breath-actuated bi-directional delivery device was an effective and well-tolerated treatment for mild-to-moderate bilateral nasal polyposis.

OBJECTIVE To demonstrate the long-term efficacy of intranasal furosemide, an inhibitor of the sodium chloride cotransporter channel at the basolateral surface of the respiratory epithelial cell, vs no therapeutic intervention vs intranasal mometasone furoate, a corticosteroid, in preventing relapses of chronic hyperplastic sinusitis with nasal polyposis. DESIGN Randomized prospective controlled study. Patients were examined every 6 months during follow-up (range, 1-9 years). PATIENTS One hundred seventy patients with bilateral obstructive or minimally obstructive chronic hyperplastic sinusitis with nasal polyposis. INTERVENTION All patients were surgically treated in the ENT Department, University of Siena Medical School. One month after surgery, group 1 patients (n = 97) started treatment with intranasal furosemide, group 2 (n = 40) received no therapeutic treatment, and group 3 (n = 33) were treated with mometasone. MAIN OUTCOME MEASURES Clinical and instrumental evaluation of postoperative outcomes. RESULTS Seventeen (17.5%) of 97 patients in group 1, 12 (30.0%) of 40 patients in group 2, and 8 (24.2%) of 33 patients in group 3 experienced nasal polyposis relapses. We noted a prevalence of early-stage relapse in patients treated with furosemide or mometasone, whereas patients who did not receive any treatment experienced more severe grades of chronic hyperplastic sinusitis with nasal polyposis (P < .005). CONCLUSION Use of intranasal furosemide represents a valid therapeutic treatment in the prevention of chronic hyperplastic sinusitis with nasal polyposis.

Background Recurrence of sinonasal polyposis after endoscopic sinus surgery can be difficult to manage.
Topical steroid sprays and irrigations may not provide adequate treatment, and systemic steroid therapy is limited by side effects. This study was designed to evaluate the efficacy of steroid-infused carboxymethylcellulose (CMC) foam as a treatment for recurrence of chronic rhinosinusitis with nasal polyposis after endoscopic sinus surgery. Methods A prospective cohort study was performed enrolling patients with recurrent sinonasal polyposis after endoscopic sinus surgery. All patients had development of symptomatic polyp disease despite aggressive postoperative topical steroid treatment. The study treatment entailed endoscopic placement of 4 mL of CMC foam hydrated with triamcinolone, 40 mg/mL (Kenalog 40; Bristol-Myers Squibb, New York, NY), into the ethmoid cavities bilaterally. Patients were evaluated using videoendoscopy and the Sino-Nasal Outcomes Test 20 (SNOT-20) at three time points: immediately before treatment, 7–14 days after treatment, and 28–35 days after treatment. The videoendoscopies were randomized and scored in a blinded fashion using a modification of the perioperative sinus endoscopy (POSE) scoring system. Results Ten treatments were performed in eight patients; two patients underwent two treatments each. Mean SNOT-20 score improved at both 1 week and 1 month after treatment (2.44 versus 1.65, p < 0.05, and 2.44 versus 1.36, p < 0.01, respectively). Videoendoscopy also revealed improvement when evaluated with the modified POSE score at 1 week (11.8 versus 8.2, p < 0.001) and 1 month (11.8 versus 7.9, p < 0.001). Conclusion Endoscopic placement of steroid-infused CMC foam improves symptoms and endoscopic findings in patients with recurrent sinonasal polyposis after endoscopic sinus surgery.

One hundred nine patients with chronic rhinosinusitis underwent functional endoscopic sinus surgery. Seventy-seven patients had polyposis. The population was studied prospectively for 5 years postoperatively.
Seventy-two patients attended the 5-year follow-up visit. At 1, 2, 3, 4 and 5 years of follow-up, all outcome measures except olfactory detection thresholds (visual analogue scores, endoscopic findings, nasal mucociliary clearance times, total nasal volumes) were significantly improved compared to preoperative baseline values. Olfactory detection thresholds were significantly improved at 1 and 2 years postoperation. Patient symptom scores were improved in a greater percentage of patients than more objective outcome measures. Thirty-eight patients required a total of 88 postoperative rescue medication courses with prednisolone and antibiotic. Twelve patients failed the study as they required at least 1 rescue medication course a month for 2 consecutive months. We demonstrated an 89% 5-year "survival" rate with regards to the risk of failure. The patients were also entered into a randomised, stratified, prospective, double-blind, placebo-controlled study of fluticasone propionate aqueous nasal spray 200 mcg twice daily, commencing 6 weeks after FESS, with a 5-year follow-up. The change in overall visual analogue score was significantly better in the FPANS group at 5 years. The changes in endoscopic oedema and polyp scores and in total nasal volumes were significantly better in the FPANS group at 4 years but not 5 years. Last-value-carried-forward analysis demonstrated that changes in endoscopic polyp score and in total nasal volume were significantly better in the FPANS group at 5 years. Significantly more prednisolone rescue medication courses were prescribed in the placebo group. Of the 12 patients who failed the study, 10 were in the placebo group. This difference nearly achieved significance.

We performed a double-blind, crossover, placebo-controlled study on the effect of fluticasone propionate (FP) treatment on chronic eosinophilic rhinosinusitis in 15 patients with aspirin-induced asthma (AIA). There were 10 women and five men aged 32-60 years; average: 45 years. After a 10-day run-in period, patients underwent two 4-week treatment courses (FP vs placebo), separated by a 2-week washout interval. Clinical activity of FP was evaluated by daily measurement of peak nasal inspiratory flow (PNIF) and a scoring system of subjective symptoms. Nasal challenges with L-lysine aspirin, using active anterior rhinomanometry, were performed at the entry and on the last day of each treatment period. Weekly mean values of symptom scores were generally lower and PNIF measurements higher during treatment with FP than with placebo. This difference was statistically significant for most recorded parameters for the whole 4-week FP treatment. On average, the reactions evoked by aspirin nasal challenge were significantly shorter and milder after treatment with FP than with placebo. In 8/13 patients, FP completely prevented aspirin-precipitated nasal reaction, whereas protection after placebo was observed in only 2/12 subjects (P = 0.004). We conclude that intranasal FP is an effective therapy for chronic eosinophilic rhinitis in patients with AIA.

Objective Assess paranasal sinus distribution of topical solutions following endoscopic sinus surgery (ESS) using various delivery devices. Study Design Experimental prospective study.
Subjects and Methods Ten cadaver sinus systems were irrigated with Gastroview before surgery, after ESS, and after medial maxillectomy. Delivery was via pressurized spray (NasaMist), neti pot (NasaFlo), and squeeze bottle (Sinus Rinse). Scans were performed before and after each delivery with a portable CT machine (Xoran xCAT), and blinded assessments were made for distribution to individual sinuses. Results Total sinus distribution was greater post-ESS (P < 0.001). Additional distribution was gained with medial maxillectomy (P = 0.02). Influence of delivery device on distribution was significantly higher with neti pot > squeeze bottle > pressurized spray (P < 0.001). Frontal sinus penetration was greatest after surgery (P = 0.001). Conclusion ESS greatly enhances the delivery of nasal solutions, regardless of delivery device. Pressurized spray solutions in un-operated sinuses provide little more than nasal cavity distribution. Use of squeeze bottle/neti pot post-ESS offers a greatly enhanced ability to deliver solutions to the paranasal sinuses.

Background Chronic rhinosinusitis has a major impact on the quality of life of patients with cystic fibrosis (CF) and may contribute to progression of chronic lung disease. Despite multiple sinus surgeries, maxillary sinus involvement is a recurrent problem. The modified endoscopic medial maxillectomy (MEMM) permits debridement in the clinic, improves mucus clearance with nasal irrigations, and increases access for topical delivery of therapeutics. However, clinical outcomes of aggressive sinus surgery with regimented postoperative medical treatment have not been systematically evaluated. Methods CF patients completed the 22-Item Sinonasal Outcome Test questionnaires before sinus surgery (and bilateral MEMM) and at sequential postoperative visits.
Objective measures included Lund-Kennedy endoscopic score and pulmonary function tests (forced expiratory volume in 1 second percent [FEV1%] predicted). Culture-directed antibiotic therapy, prednisone, and topical irrigations were initiated postoperatively. Results Twenty-two patients (mean age, 26.5 years; 4.9 prior sinus operations) underwent MEMM and sinus surgery. Symptom scores were significantly reduced at 60 days (primary outcome, 64.7 ± 18.4 presurgery versus 27.5 ± 15.3 postsurgery; p < 0.0001) and up to a year postoperatively (27.6 ± 12.6; p < 0.0001). Endoscopic scores were also reduced after surgery (10.4 ± 1.1 presurgery versus 5.7 ± 2.4 [30 days], 5.7 ± 1.4 [60 days], 5.8 ± 1.3 [120 days], and 6.0 ± 1.1 [1 year]; p < 0.0001). There were no differences in FEV1% predicted up to 1 year postoperatively, but hospital admissions secondary to pulmonary exacerbations significantly decreased (2.0 ± 1.4 versus 3.2 ± 2.4, respectively; p < 0.05). Conclusion Prospective evaluation indicates sinus surgery with MEMM is associated with marked improvement in sinus disease outcomes. Additional studies are necessary to confirm whether this treatment paradigm is associated with improved CF pulmonary disease.

Beneficial effects of intranasal beclomethasone dipropionate (Bdp) in patients with nasal polyposis have been reported earlier. This study was carried out to investigate whether long-term treatment with Bdp after polypectomy could prevent formation of new polyps and reduce the number of surgical removals. Forty consecutive patients without laboratory or other clinical signs of allergy but with severe nasal polyposis were included in the study. Twenty patients were treated with intranasal Bdp and twenty patients received no treatment after polypectomy. All patients were followed for at least 2.5 years. The size of the polyps that recurred was estimated at different time intervals by the examining doctor.
After six months there was already a significant difference in favour of the group treated with intranasal Bdp. Further results of the study and the clinical implications are discussed.

Chronic eosinophilic rhinosinusitis underlies a range of respiratory disorders including nasal polyposis. Surgical and medical methods are used to control polyps, with topical steroids commonly being used for their anti-inflammatory properties. Fluticasone propionate nasal drops (FPND) is a formulation developed specifically for an effective and well-tolerated corticosteroid treatment of nasal polyposis.

OBJECTIVES/HYPOTHESIS Evidence is lacking to guide the postoperative management of Samter's triad patients with chronic rhinosinusitis with polyposis (CRSwP) undergoing endoscopic sinus surgery (ESS). The purpose of this study was to compare three different standardized medication regimens prescribed to these patients after ESS. STUDY DESIGN Three-arm, randomized, double-blinded, controlled trial. METHODS Patients with Samter's triad undergoing ESS were postoperatively randomized into three medication regimens: saline irrigation alone (control group A), saline irrigation plus separate budesonide nasal spray (group B), and saline irrigation mixed with budesonide nasal spray (group C). Outcome measures were Sino-Nasal Outcome Test scores, Lund-Mackay computed tomography scores, and Lund-Kennedy endoscopic scores taken at preoperative baseline, and then at 6 months and 1 year postoperatively. Side effect profiles were also measured (adrenocorticotropic hormone blood level ranges and intraocular pressure at the same interval points). Analysis of variance and χ² analyses were conducted using a Bonferroni correction method and routine descriptive statistics. Inter- and intragroup comparisons were made. RESULTS Sixty subjects were recruited. All groups were equivalent at baseline in all outcomes.
All intragroup analyses showed statistically and clinically significant improvement in disease status as compared to baseline (P < .0167), with a sustained but lessened improvement at 1 year. However, no statistically or clinically significant differences were observed between groups at any time point (P > .05). There was no treatment effect noted. CONCLUSIONS In this study, nasal steroids did not confer any additional benefit over saline alone as post-ESS care for the Samter's triad CRSwP patient population.

OBJECTIVE To assess the efficacy and tolerability of once-daily treatment with budesonide aqueous nasal spray in patients with nasal polyps. DESIGN Randomized, double-blind, placebo-controlled, parallel-group study. SETTING Sixteen hospital clinics. PATIENTS One hundred eighty-three patients with moderate-sized nasal polyps causing clinically significant symptoms during a 1-week run-in period. INTERVENTIONS Patients were randomized to receive 1 of the following 4 budesonide aqueous nasal spray treatments: 128 microg once daily in the morning and placebo in the evening, 128 microg twice daily, 256 microg once daily in the morning and placebo in the evening, or placebo for 8 weeks. Nasal polyp size was scored and peak nasal inspiratory flow was measured at clinic visits at the beginning and end of the run-in period and after 4 and 8 weeks' treatment. Patients recorded daily peak nasal inspiratory flow, symptom scores (ie, blocked nose, runny nose, and sneezing) and sense of smell on diary cards. MAIN OUTCOME MEASURES Mean change in nasal polyp size at the end of treatment; mean changes in combined and individual symptom scores. RESULTS All doses of budesonide aqueous nasal spray significantly (P < .01) reduced polyp size; no significant differences were noted between the 4 treatment groups.
The mean improvement in clinic peak nasal inspiratory flow at 8 weeks was 65.9 L/min with budesonide aqueous nasal spray, 128 microg twice daily; 71.6 L/min with budesonide aqueous nasal spray, 256 microg once daily; and 54.6 L/min with budesonide aqueous nasal spray, 128 microg once daily (all P < .001 vs placebo). Combined and individual symptom scores and sense of smell improved significantly in all budesonide-treated groups; the effect on symptoms became apparent within 1 to 2 days of the first dose. Budesonide aqueous nasal spray was well tolerated. CONCLUSIONS Budesonide aqueous nasal spray, 128 microg once daily, was found to be effective in the treatment of nasal polyps, and budesonide aqueous nasal spray, 256 microg once daily, did not show any significant additional efficacy.

The efficacy and safety of fluticasone propionate (FP) nasal drops were investigated in two multicentre, randomized, placebo-controlled trials. Patients received FP 400 microg once or twice daily for 12 weeks and then FP 400 microg once daily for a further 12 weeks. FP 400 microg significantly reduced polyp size and improved peak nasal inspiratory flow, rhinitis symptoms and sense of smell when administered twice daily. Significant reductions in polyp size were not achieved with once-daily administration, but clinical benefits were observed for peak nasal inspiratory flow. Both dosing regimens were well tolerated, with an overall incidence of adverse events similar to placebo.

OBJECTIVE Whether instillation into the maxillary sinus of topical budesonide affected the immune response and improved allergic patients with chronic rhinosinusitis who had persistence of symptoms despite appropriate surgical intervention was assessed. STUDY DESIGN Double-blind placebo-controlled.
METHODS Twenty-six patients with allergy to house dust mites who had previously had surgery and who had persistent symptoms of disabling rhinorrhea or pressure-pain resistant to oral antibiotics and intranasal corticosteroids were recruited. During the double-blind study, patients instilled 256 microg budesonide daily or placebo through an intubation device (maxillary antrum sinusotomy tube) into one of the maxillary sinuses for 3 weeks before clinical assessment and a second biopsy. RESULTS We found an improvement in the symptom scores in 11 of the 13 patients who received budesonide; we also found a decrease in CD-3 (P = .02) and eosinophils (P = .002), and a decrease in the density of cells expressing interleukin-4 (P = .0001) and interleukin-5 messenger RNA (P = .006) after treatment. CONCLUSION Topical budesonide delivered through a maxillary antrum sinusotomy tube can control chronic rhinosinusitis that persists after surgery.

This double-blind, parallel study compared flunisolide, 2 × 25 mcg in each nostril twice daily, with placebo in the prophylaxis of nasal polyposis recurrence after surgery. The treatment lasted for 12 months. The study was conducted according to the recommendations of the Declaration of Helsinki, and the patients gave verbal consent to participate. The study was reviewed by the Norwegian Medicines Control Authority. Forty-one patients with first or recurrent polypectomy were enrolled. Thirty-seven patients completed the 12-month period. Four patients dropped out prematurely for reasons unrelated to the test drug. Flunisolide was significantly superior to placebo in preventing recurrence of polyps during 6 to 12 months' treatment, both with respect to number (p = 0.05) and size (p = 0.03) of polyps. Nasal symptoms of sneezing and stuffiness decreased significantly for flunisolide-treated patients during treatment.
In the placebo group, there was a significant increase in stuffiness throughout the year. For runny nose, there was no difference between the treatments. Six flunisolide patients and 10 placebo patients reported side effects during the one-year treatment, transient mild itching being the most common complaint. Three cases of secretion with bloody traces were reported. No patient withdrew for drug-related reasons. In this study, flunisolide was significantly more effective than placebo in preventing recurrence of nasal polyposis during one year's treatment after polypectomy.

Topical corticosteroids are the accepted medical adjunct to surgery in patients suffering from nasal polyposis. Fluticasone propionate (FP) is a potent, topically active corticosteroid which has been formulated as nasal drops specifically for the treatment of polyposis.

Nasal polyps are commonly treated surgically. Intranasal administration of topical corticosteroids has gained increased acceptance as a treatment alternative. The aim of our study was to compare the efficacy of two formulations of budesonide with placebo on nasal polyps. At four Danish clinics, 138 patients suffering from moderate or severe nasal polyps were randomized to twice-daily treatment with Rhinocort® Aqua 128 μg, Rhinocort Turbuhaler® 140 μg or placebo (Astra Draco, Sweden) for 6 weeks. Polyp size (primary efficacy variable), nasal symptoms, sense of smell, and patients' overall evaluation of treatment efficacy were assessed by scores. Polyp size was reduced significantly in both budesonide-treated groups compared with placebo, but there was no statistical difference between the two actively treated groups. Patients' nasal symptom scores were significantly more reduced in the Aqua group compared to the Turbuhaler-treated group, and both reduced symptom scores were significantly better compared to placebo.
Sense of smell was significantly improved in the actively treated groups compared to placebo. The proportion of patients rating substantial or total control over symptoms after 6 weeks' treatment was 60.9% and 48.2% in the Aqua- and Turbuhaler-treated groups, respectively, which was significantly better compared with 29.8% in the placebo-treated group. Rhinocort Aqua and Rhinocort Turbuhaler were equally well tolerated.

Background Local corticosteroids are widely used in the treatment of nasal polyps and chronic rhinosinusitis both before and after nasal surgery. Their efficacy after functional endoscopic sinus surgery (FESS) has not been fully established by placebo-controlled trials.

We have previously compared different scoring systems for endoscopic staging of nasal polyps. Of the five methods evaluated, we found that two were better than the others with regard to reproducibility and agreement between physicians. One method was lateral imaging, developed by the authors, and the other was a scoring system developed by Lildholdt et al. The main objective of the present study was to compare the sensitivity of these two methods. Another aim was to study the effect on nasal polyposis of topical nasal corticosteroids over a 2-week period. Patients with bilateral nasal polyposis (n = 100) were randomized to a 2-week treatment with a topical corticosteroid (budesonide aqueous nasal spray; 128 μg b.i.d.) or placebo in a double-blind manner. Nasal symptoms were scored before treatment and after 3, 7 and 14 days of treatment, and the patients underwent nasal endoscopy at clinical visits. Patients treated with active substance had an improvement in their symptoms, an effect already detectable after 3 days of treatment, compared with those who received placebo. In addition, a statistically significant decrease in polyp size could be registered after 14 days using lateral imaging but not with the other scoring system.
In conclusion, lateral imaging was more sensitive and could detect effects earlier than the other scoring system, and can be recommended for the endoscopic staging of nasal polyps in clinical studies.

The objectives of this study were to determine whether the size of nasal polyps could be assessed in a reproducible way and to study the effect of budesonide in patients suffering from eosinophilic nasal polyposis with small and medium-sized polyps. Ninety-one patients entered the study. Budesonide 200 micrograms b.i.d. (aerosol or aqua) and placebo (aerosol or aqua) were compared in a 3-month double-blind, multi-centre, randomized study. Efficacy was measured by an assessment of polyp size, nasal peak-flow index and the patient's assessment of nasal symptoms. The conclusions were that the size of polyps could be assessed in a reproducible way, and that budesonide delivered either as a water-based suspension or as an aerosol is effective in the treatment of patients with small and medium-sized polyps.

Recently, the YAMIK sinus catheter (YAMIK) has been reported to be a useful therapeutic device in the treatment of sinusitis. The present study was conducted to compare its delivery of either normal saline (NS) or a betamethasone solution (0.4 mg/ml) into the paranasal sinuses of 25 patients (39 sides) with chronic sinusitis. The following parameters were evaluated: (1) subjective nasal clinical symptoms (nasal discharge, nasal obstruction, postnasal drip and headache), (2) X-ray photographs (ethmoid and maxillary sinuses) and (3) cytokine levels (IL-1β, IL-8 and TNF-α) by enzyme-linked immunosorbent assay. The total nasal symptom scores significantly decreased after the first therapy, and the total X-ray photograph scores significantly decreased after therapy with either NS or the betamethasone solution.
In both NS and betamethasone patients, the levels of IL-1β and IL-8 had significantly decreased by the 3rd and 2nd weeks after therapy, respectively. In contrast, the TNF-α level decreased after the first therapy with betamethasone solution and remained unchanged after therapy with NS. These findings suggest that evacuation of the pathological effusions in sinuses may exert a beneficial effect by reducing the levels of IL-1β and IL-8, and we speculate that removal of pathological effusions from the sinuses may provide treatment through different mechanisms than those that occur in treatment with betamethasone.

This study evaluated the efficacy and tolerability of budesonide in an aqueous nasal spray (BANS) in patients with chronic rhinosinusitis. In this double-blind, placebo-controlled, multicentre, parallel-group study, patients (n = 167) with persistent rhinosinusitis symptoms despite 2 weeks' antibiotic treatment were randomised to receive BANS 128 micrograms b.i.d. or placebo for 20 weeks. Morning combined symptom scores (CSS) in patients receiving BANS decreased by a mean of -1.85 (95% CI -2.27, -1.43), versus -1.02 (-1.43, -0.61) in the placebo group (p = 0.005); corresponding values for evening CSS were -1.78 (-2.22, -1.35) and -1.02 (-1.45, -0.60), respectively (p = 0.012). BANS produced significant reductions in nasal congestion and discharge scores, and improved patients' sense of smell (morning only), versus placebo. Peak nasal inspiratory flow (PNIF) increased significantly during BANS treatment. In allergic patients, BANS significantly (p < 0.001) reduced both morning (-1.40 [-2.18, -0.62]) and evening (-1.37 [-2.15, -0.58]) CSS from baseline versus placebo, but changes in non-allergic patients (morning: -0.04 [-0.95, 0.87]; evening: 0.14 [-0.81, 1.09]) were not significant.
PNIF was significantly (p < 0.01) increased in both allergic and non-allergic patients from baseline versus placebo. BANS is an effective and well-tolerated treatment for chronic rhinosinusitis.

50 patients with chronic mucopurulent rhinosinusitis were randomly allocated to treatment with nasal sprays of dexamethasone, tramazoline, and neomycin; dexamethasone and tramazoline with no antibiotic; or matched placebo (propellant alone) four times daily to both nostrils for 2 weeks. The patients were assessed in a double-blind manner for symptomatic response and improvement in nasal mucociliary clearance, nasal airway resistance, sinus radiographs, and intranasal bacteriology and appearance. Both active preparations (with antibiotic, 14 of 20 patients responded; without antibiotic, 12 of 20 patients responded) were more effective than the placebo (2 of 10 patients responded). There was no significant difference in response between the active preparations with and without antibiotic. Thus, in treatment of chronic mucopurulent rhinosinusitis, reduction of the inflammatory response and decongestion make topical antibiotic unnecessary, probably by allowing host clearance mechanisms to recover.

A randomized, double-blind, placebo-controlled trial was performed to assess the efficacy of once-daily budesonide in patients with nasal polyps. After a 2-week run-in period, 157 patients with symptomatic bilateral nasal polyposis were randomized to receive budesonide, 140 micrograms once or twice daily or 280 micrograms once daily (delivered doses) via Turbuhaler, or placebo for 8 weeks. Polyp size was assessed endoscopically and, in two centres, by magnetic resonance imaging (MRI). Nasal symptoms (blocked nose, runny nose, sneezing) were recorded daily, and patients provided an overall assessment of efficacy at the end of the study. Budesonide, 280 micrograms/day (280 micrograms o.d.
and 140 micrograms twice daily), significantly reduced polyp size compared with placebo, whereas budesonide, 140 micrograms once daily, had no significant effect. Nasal polyp mass score, measured by MRI, was also significantly reduced in patients receiving 280 micrograms/day. All three doses of budesonide significantly reduced symptom scores, and there were no significant differences between the groups. Overall, approximately 70% of patients receiving budesonide, 280 micrograms/day, reported substantial or total control of symptoms, compared with 45% of placebo-treated patients. It is concluded that budesonide, 280 micrograms once daily, reduces polyp size and relieves symptoms in patients with nasal polyposis. OBJECTIVE/HYPOTHESIS Endoscopic sinus surgery is an accepted treatment for medically recalcitrant chronic rhinosinusitis. Effective saline douching may improve long-term outcomes of chronic rhinosinusitis but is often impaired by postoperative ostial stenosis. The aim of this study is to determine a critical ostial size at which douching solution reliably enters the sinus cavities. STUDY DESIGN Prospective study of a consecutive patient cohort. METHODS Seventeen preoperative or well-healed postoperative endoscopic sinus surgery patients were irrigated with 5 mL blue food coloring mixed with 200 mL buffered saline from a squeeze bottle. The degree of sinus penetration, sinus ostial patency, and ostial size were endoscopically determined. RESULTS Sinuses penetrated by blue dye had a significantly larger minimal ostial dimension (7.31 mm; 95% confidence interval 5.54-9.08) than those that had no blue dye penetration (1.26 mm; 95% confidence interval 0.86-1.66), as determined by Student t test. Chi-square analysis showed that operated sinuses were more likely to be penetrated than nonoperated sinuses (P = .0016) and obstructed sinuses (P = .0325).
Logistic regression showed a 95% probability of penetration when the minimum ostial dimension is 3.95 mm or greater. CONCLUSIONS Unoperated sinuses or cases with gross sinus ostial obstruction will not be reliably penetrated by sinus irrigant. A 3.95-mm ostial diameter seems to be the minimum size to guarantee penetration of the paranasal sinuses and maximize the potential for topical sinus treatment. Chronic rhinosinusitis (CRS) is a recalcitrant inflammatory process which has a marked detrimental impact on quality of life. At present there is no cure for this condition; measures are taken to stop progression and provide symptomatic relief. Topical corticosteroids are commonly prescribed in the management of CRS, but few trials show effectiveness in the clinical setting. We set up a randomized, double-blind, placebo-controlled trial to study the effectiveness of a topical corticosteroid agent -- fluticasone propionate aqueous nasal spray (FPANS) -- in patients with CRS. We measured symptoms, diary card and rigid endoscopy scores, acoustic rhinometry, middle meatal swabs, and blood tests -- CRP, ESR, WBC, and eosinophil count. Measurements were done at the start of the trial, at 8 weeks, and at 16 weeks where possible. Twenty-two patients completed the trial; 9 received FPANS and 13 had placebo. There was no difference between the 2 groups on all counts. When patients were considered as one group, there was an improvement in the diary card scores (p = 0.054), comparing baseline to 8 or 16 weeks. There was no evidence that the regular use of topical corticosteroid increased the risk of developing an infection. An important observation was that the topical corticosteroid did not precipitate acute sinusitis.
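The 3.95-mm threshold above is obtained by inverting a fitted logistic model: solve logit(p) = b0 + b1·x for the covariate value x at which p = 0.95. As a minimal sketch only -- the intercept and slope below are hypothetical illustrations, since the abstract does not report the fitted coefficients:

```python
import math

def dimension_for_probability(b0, b1, p):
    """Invert logit(p) = b0 + b1*x to find the covariate value x
    at which a logistic regression predicts probability p."""
    return (math.log(p / (1 - p)) - b0) / b1

# Hypothetical coefficients for illustration only (not from the study).
b0, b1 = -2.0, 1.25
x95 = dimension_for_probability(b0, b1, 0.95)
print(f"95% penetration probability at x >= {x95:.2f} mm")
```

With these assumed coefficients the inversion lands near 3.96 mm; the study's own coefficients would be needed to reproduce its 3.95-mm figure exactly.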
There is compelling evidence that topical corticosteroids down-regulate cytokine expression, and it is likely that a larger and longer multi-centre trial may prove their efficacy in CRS. Numerous studies have postulated the possible benefit of corticosteroids on olfaction in patients with nasal/sinus disease. Twenty-nine patients with bilateral nasal polyps were included in our study using strict selection criteria to reduce other aetiologies of olfactory dysfunction. The University of Pennsylvania Smell Identification Test (UPSIT) was performed pre-operatively on the right and left nostrils separately. Following intranasal polypectomy the patients received a six-week course of beclomethasone nasal spray (Beconase) to one nostril only, with the other acting as a control. The UPSIT scores were again obtained for each nostril separately. The Wilcoxon Signed Rank test revealed no statistically significant difference in UPSIT scores between treated and untreated nostrils (p = 0.31; power 70 per cent; ES = 0.47). We conclude that topical beclomethasone does not improve olfaction following nasal polypectomy. Intranasal budesonide, 400 micrograms two times a day, was evaluated in 36 patients referred for treatment of nasal polyposis. The age range was 20 to 68 years. Polypectomy had been done 5.6 (mean) times previously. After a 5-week, treatment-free baseline period, patients were treated in a double-blind fashion with either budesonide or placebo for 4 weeks. After this treatment period, placebo-treated patients started receiving budesonide in an open trial for an additional 4 weeks. The patients rated their nasal symptoms daily. Nasal examinations and nasal inspiratory flow rate (IFR) measurements were done at clinic visits. After 3 and 4 weeks of treatment, the response to budesonide was significantly greater than the response to placebo.
The greater reduction in nasal blockage caused by polyps, observed on physical examination (p = 0.005), was mirrored by an increase in nasal IFR (p = 0.0001). Patient ratings of the severity and frequency of nasal blockage were reduced more by budesonide than by placebo (p ≤ 0.0005). Switching placebo-treated patients to budesonide treatment resulted in a reduction of nasal blockage (p < 0.001) and an increase in nasal IFR (p < 0.001). The results demonstrate that topical nasal budesonide, 400 micrograms two times a day, is an effective treatment for nasal polyps. The clinical efficacy and adverse effects of budesonide administered as a nasal aerosol in addition to sinus washings and erythromycin therapy were assessed by comparison with placebo in a randomized, double-blind study of 40 patients with chronic or recurrent maxillary sinusitis. Most of the patients had been referred for operative treatment. Corticosteroid therapy, 400 micrograms daily, or placebo was continued for 3 months. Budesonide and antral irrigations reduced nasal symptoms more effectively than placebo, and there was a significantly greater reduction in facial pain and sensitivity in the budesonide group than in the placebo group. During the treatment period, mucosal thickening as evaluated by radiology decreased more clearly in the budesonide group than in the placebo group, but the difference did not reach statistical significance. The most frequently isolated bacteria were Staphylococcus aureus, Staphylococcus epidermidis and Haemophilus influenzae. Only 2 of 20 Haemophilus strains were beta-lactamase producers. The cellular picture was dominated by neutrophils in all secretions. There was no significant difference in clinical outcome between the two groups. Topical steroid therapy did not cause any adverse effects.
The aim of the study was to assess the efficacy and safety of nasal aqueous beclomethasone dipropionate (BDP), 400 µg/day, given via a metered pump in a once-daily or twice-daily regimen, following a double-blind, parallel-group design over a 12-week period. Adult patients (n = 112) with allergic or non-allergic chronic rhinosinusitis recorded their nasal and ocular symptoms for the 7-day run-in period and for the first 4 weeks of treatment. At baseline and after 4 weeks, airway resistance via active anterior rhinomanometry and the volume and area section via acoustic rhinometry were measured. Morning serum cortisol was measured at baseline and at week 12. Adverse events were to be reported at each visit. Of the 112 randomised patients, three did not enter the ITT analysis and another 13 in total discontinued the treatment. Significant improvements over baseline were reported in both groups for the primary variable, the sum of nasal scores (-53.7% in the once-daily group and -59.7% in the twice-daily group), as well as for each nasal and ocular symptom, without differences between the groups. Because of a wider variability than expected, the 95% confidence interval (CI) for the difference between the least square means exceeded the pre-defined limit of ±10% of the reference mean. Similar improvements in both groups were also reported for the nasal airway patency parameters. The total number of drug-related adverse events was 26 in the once-daily group and 32 in the twice-daily group, with most of the events consisting of local effects at the site of application. No signs of adrenal suppression were observed, and serum morning cortisol values did not significantly change.
The once-daily BDP dosing (400 µg/day) therefore has a similar efficacy and safety profile as the same daily dose given in a twice-daily regimen. Background Postoperative irrigation after endoscopic sinus surgery and the endoscopic modified Lothrop procedure is used to remove nasal crusts and to improve wound healing. To evaluate the optimal application protocol for irrigation of the frontal sinus, a prospective cadaver study was performed. Methods An endoscopic modified Lothrop procedure and complete sphenoethmoidectomy were performed in 19 heads. Each was irrigated with a 1.5% solution of water and different colors using nasal spray and a squeeze bottle filled with 50, 100, and 200 mL. Intensity of local staining and percentage of area were documented using standardized videoendoscopy after irrigation in the “bending over the sink” or “vertex to floor” position. Grading was performed by two independent observers for 23 anatomic regions, including the stained circumference of the maxillary and frontal ostia. To evaluate the influence of the anatomy, acoustic rhinometry was performed. ANOVA was used to evaluate effects of application methods and head positions with GenStat 8.2 (Lawes Agricultural Trust, Rothamsted Experimental Station, Harpenden, U.K.), using an appropriate block structure. Results With regard to the frontal sinus, we were able to show clear superiority of the squeeze bottle technique filled with 200 mL and applied in the “vertex to floor” position. Conclusion In a relatively fit and flexible patient the “vertex to floor” position using a squeeze bottle technique is advocated. There may be some patients, however, for whom this position is not feasible. In these patients “bending over the sink,” while inferior to the “vertex to floor” position, still ensures some irrigation of the frontal sinus. A double-blind parallel trial of budesonide versus placebo was carried out in 19 patients suffering from nasal polyposis.
The patients were randomly assigned to two groups, one receiving a daily dose of 400 micrograms of budesonide and the other a corresponding dose of placebo intranasally for 4 months. The total mean symptom scores and nasal peak flow values during the trial showed a statistically significant difference in favour of budesonide. On rhinoscopy, a clear decrease in mucosal congestion was noticed in the group receiving the active drug. The polyps diminished in number and size in the budesonide group. Mucociliary function, measured by the saccharin test before and after therapy, did not undergo significant changes in either group. Mean plasma cortisol values were of the same order before and after therapy. Local side-effects were mild in both groups, and there was no indication of mucosal drying or crusting attributable to the test solution. Pre- and post-therapy biopsies revealed a pronounced decrease in tissue eosinophilia following budesonide treatment. Neither nasal smears nor biopsies showed signs of morphological changes in the epithelium. In a double-blind trial thirty-five patients with moderately severe nasal polyposis were treated with intranasal beclomethasone dipropionate aerosol for 3 weeks. The dose given (400 µg/day) had only a local effect on the symptoms. Judged by diary card scores, the nasal symptoms were reduced to 52% of the pre-trial level for the whole group. Corrected for the placebo effect, the percentage was 68. The reduction of symptoms was equally apportioned across the three symptoms: sneezing, nasal secretion and blockage. BACKGROUND The aim was to investigate the health impact of nasal polyposis with asthma and to study the effects of endoscopic sinus surgery (ESS), and the addition of fluticasone propionate nasal drops (FPND), on health-related quality of life (HRQoL). METHODS Prospective study of 68 patients with nasal polyposis and asthma. Effects were measured with the 36-Item Short Form Health Survey (SF-36).
A randomized, double-blind, placebo-controlled 14-week phase measuring the additive effects of FPND 400 µg twice daily (b.i.d.) was included. RESULTS HRQoL was significantly decreased in both the Physical Component Summary, PCS (45 vs 48, p = 0.049), and the Mental Component Summary, MCS (43 vs 51, p < 0.001), versus the reference population. ESS significantly improved PCS (p = 0.027) and MCS (p = 0.021) after five weeks. We found significant additional benefit of FPND in three domains (RP, p = 0.002; VT, p = 0.007; SF, p = 0.002). The increase in HRQoL with FPND reached reference population levels in all domains, as well as in both PCS (50, p = 0.003) and MCS (52, p = 0.002), five weeks after ESS. CONCLUSIONS FPND 400 µg b.i.d. can be added to ESS to improve HRQoL, and to reach population levels, as early as five weeks post-ESS. Physicians should evaluate HRQoL and consider ESS with nasal steroids early in their treatment of these patients. Budesonide has been used for a number of years as a topical nasal corticosteroid in the treatment of nasal allergy and nasal polyps. Recently, a new device for powder insufflation where no constituents or preservatives are included has been developed (Rhinocort Turbuhaler, Astra Draco AB, Sweden). The present investigation was designed to study the efficacy of topical budesonide powder as the only treatment of nasal polyps. A total of 126 patients entered the study. The medical history and clinical recordings included symptoms and signs, a semiquantitative test of smell, and measurement of nasal expiratory peak flow index. Medication was either 200 or 400 micrograms of budesonide powder b.i.d. or placebo. After 1 month an overall assessment of treatment efficacy was made to determine whether the treatment had been a success or a failure. The results showed a statistically significant improvement of symptoms and signs in the actively treated groups.
The increase in expiratory peak flow index was about 60% in the actively treated groups as opposed to 16% in the placebo group. The overall assessment of treatment efficacy showed success in about 82% of actively treated patients as opposed to about 43% in the placebo group. It is concluded that budesonide powder is useful in the treatment of nasal polyps. Twenty-two patients with nasal polyps completed a double-blind study with a new topical corticosteroid, flunisolide. Treatment during three months after operation gave, in comparison with placebo, a statistically significant effect on the symptom of stuffy nose, and on the sum of scores for stuffy nose, runny nose and sneezing. There was no significant effect on rhinomanometry. Side effects were negligible. Three patients in the placebo group required a further operation within one year but none in the flunisolide group. Prophylactic treatment with flunisolide can be recommended as a complement to other treatment after surgery for nasal polyps. BACKGROUND Steroid treatment is the mainstay of therapy for nasal polyps and rhinosinusitis. Oral steroids have considerable systemic side effects, and nasal sprays do not sufficiently reach the middle meatus, where polyps originate. Nasal drops might be a more useful formulation to deliver steroids into the middle meatus. OBJECTIVE We sought to investigate whether treatment with fluticasone propionate nasal drops (FPNDs) can reduce the need for surgery, as measured by signs and symptoms of nasal polyposis and chronic rhinosinusitis, in patients on the waiting list for functional endoscopic sinus surgery (FESS). METHODS Fifty-four patients (28 male) with severe nasal polyposis, chronic rhinosinusitis, or both, indicated for FESS, were included in a 12-week, double-blind, placebo-controlled study. Use of intranasal steroid spray was stopped at least 4 weeks before randomization.
Signs and symptoms were recorded before, during, and at the end of the study period. At the end of the study, a computed tomographic scan was performed, and the need for operation was reassessed using a standardized scoring method. RESULTS FESS was no longer required in 13 of 27 patients treated with FPNDs versus 6 of 27 in the placebo group (P < .05). Six patients from the placebo group dropped out versus 1 from the FPND group. Symptoms of nasal obstruction, rhinorrhea, postnasal drip, and loss of smell were reduced in the FPND group (P < .05). Peak nasal inspiratory flow scores increased significantly (P < .01). Polyp volume decreased in the FPND group (P < .05), and computed tomographic scores improved in both groups (P < .05). CONCLUSION Treatment with FPNDs in patients indicated for FESS can reduce the need for surgery.
11,071
28,530,049
Across all RCTs, there was no indication of harm attributable to HIVST, and potential increases in risk-taking behaviour appeared to be minimal. HIVST is associated with increased uptake and frequency of testing in RCTs. Such increases, particularly among those at risk who may not otherwise test, will likely identify more HIV-positive individuals compared to standard testing services alone.
INTRODUCTION HIV self-testing (HIVST) is a discreet and convenient way to reach people with HIV who do not know their status, including many who may not otherwise test. To inform World Health Organization (WHO) guidance, we assessed the effect of HIVST on uptake and frequency of testing, as well as identification of HIV-positive persons, linkage to care, social harm, and risk behaviour.
Background Achieving higher rates of partner HIV testing and couples testing among pregnant and postpartum women in sub-Saharan Africa is essential for the success of combination HIV prevention, including the prevention of mother-to-child transmission. We aimed to determine whether providing multiple HIV self-tests to pregnant and postpartum women for secondary distribution is more effective at promoting partner testing and couples testing than conventional strategies based on invitations to clinic-based testing. Methods and Findings We conducted a randomized trial in Kisumu, Kenya, between June 11, 2015, and January 15, 2016. Six hundred antenatal and postpartum women aged 18–39 y were randomized to an HIV self-testing (HIVST) group or a comparison group. Participants in the HIVST group were given two oral-fluid-based HIV test kits, instructed on how to use them, and encouraged to distribute a test kit to their male partner or use both kits for testing as a couple. Participants in the comparison group were given an invitation card for clinic-based HIV testing and encouraged to distribute the card to their male partner, a routine practice in many health clinics. The primary outcome was partner testing within 3 mo of enrollment. Among 570 participants analyzed, partner HIV testing was more likely in the HIVST group (90.8%, 258/284) than the comparison group (51.7%, 148/286; difference = 39.1%, 95% CI 32.4% to 45.8%, p < 0.001). Couples testing was also more likely in the HIVST group than the comparison group (75.4% versus 33.2%, difference = 42.1%, 95% CI 34.7% to 49.6%, p < 0.001). No participants reported intimate partner violence due to HIV testing. This study was limited by self-reported outcomes, a common limitation in many studies involving HIVST due to the private manner in which self-tests are meant to be used.
Conclusions Provision of multiple HIV self-tests to women seeking antenatal and postpartum care was successful in promoting partner testing and couples testing. This approach warrants further consideration as countries develop HIVST policies and seek new ways to increase awareness of HIV status among men and promote couples testing. Trial Registration ClinicalTrials.gov NCT02386215. BACKGROUND Frequent testing of individuals at high risk of HIV is central to current prevention strategies. We aimed to determine if HIV self-testing would increase frequency of testing in high-risk gay and bisexual men, with a particular focus on men who delayed testing or had never been tested before. METHODS In this randomised trial, HIV-negative high-risk gay and bisexual men who reported condomless anal intercourse or more than five male sexual partners in the past 3 months were recruited at three clinical and two community-based sites in Australia. Enrolled participants were randomly assigned (1:1) to the intervention (free HIV self-testing plus facility-based testing) or standard care (facility-based testing only). Participants completed a brief online questionnaire every 3 months, which collected the number of self-tests used and the number and location of facility-based tests, and HIV testing was subsequently sourced from clinical records. The primary outcome of number of HIV tests over 12 months was assessed overall and in two strata: recent (last test ≤2 years ago) and non-recent (>2 years ago or never tested) testers. A statistician who was masked to group allocation analysed the data; analyses included all participants who completed at least one follow-up questionnaire. After the 12-month follow-up, men in the standard care group were offered free self-testing kits for a year. This trial is registered with the Australian New Zealand Clinical Trials Registry, number ACTRN12613001236785.
FINDINGS Between Dec 1, 2013, and Feb 5, 2015, 182 men were randomly assigned to self-testing, and 180 to standard care. The analysis population included 178 (98%) men in the self-testing group (174 person-years) and 165 (92%) in the standard care group (162 person-years). Overall, men in the self-testing group had 701 HIV tests (410 self-tests; mean 4·0 tests per year), and men in the standard care group had 313 HIV tests (mean 1·9 tests per year); rate ratio (RR) 2·08 (95% CI 1·82-2·38; p<0·0001). Among recent testers, men in the self-testing group had 627 tests (356 self-tests; mean 4·2 per year), and men in the standard care group had 297 tests (mean 2·1 per year); RR 1·99 (1·73-2·29; p<0·0001). Among non-recent testers, men in the self-testing group had 74 tests (54 self-tests; mean 2·8 per year), and men in the standard care group had 16 tests (mean 0·7 per year); RR 3·95 (2·30-6·78; p<0·0001). The mean number of facility-based HIV tests per year was similar in the self-testing and standard care groups (mean 1·7 vs 1·9 per year, respectively; RR 0·86, 0·74-1·01; p=0·074). No serious adverse events were reported during follow-up. INTERPRETATION HIV self-testing resulted in a twofold increase in frequency of testing in gay and bisexual men at high risk of infection, and a nearly fourfold increase in non-recent testers, compared with standard care, without reducing the frequency of facility-based HIV testing. HIV self-testing should be made more widely available to help increase testing and earlier diagnosis. FUNDING The National Health and Medical Research Council, Australia. In the GRADE approach, randomized trials start as high-quality evidence and observational studies as low-quality evidence, but both can be rated down if a body of evidence is associated with a high risk of publication bias.
Even when individual studies included in best-evidence summaries have a low risk of bias, publication bias can result in substantial overestimates of effect. Authors should suspect publication bias when available evidence comes from a number of small studies, most of which have been commercially funded. A number of approaches based on examination of the pattern of data are available to help assess publication bias. The most popular of these is the funnel plot; all, however, have substantial limitations. Publication bias is likely frequent, and caution in the face of early results, particularly with small sample size and number of events, is warranted. Reaching universal HIV-status awareness is crucial to ensure all HIV-infected patients access antiretroviral treatment (ART) and achieve virological suppression. Opportunities for HIV testing could be enhanced by offering self-testing in populations that fear stigma and discrimination when accessing conventional HIV Counselling and Testing (HCT) in healthcare facilities. This qualitative research aims to examine the feasibility and acceptability of unsupervised oral self-testing for home use in an informal settlement of South Africa. Eleven in-depth interviews, two couple interviews, and two focus group discussions were conducted with seven healthcare workers and thirteen community members. Thematic analysis was done concurrently with data collection. Acceptability of offering home self-testing was demonstrated in this research. Home self-testing might help this population overcome barriers to accepting HCT; this was particularly expressed in the male and youth groups.
Nevertheless, pilot interventions must provide evidence of potential harm related to home self-testing, intensify efforts to offer quality counselling, and ensure linkage to HIV/ART care following a positive self-test result. Background: Self-testing may increase HIV testing and decrease the time people with HIV are unaware of their status, but there is concern that the absence of counseling may result in increased HIV risk. Setting: Seattle, Washington. Methods: We randomly assigned 230 high-risk HIV-negative men who have sex with men to have access to oral fluid HIV self-tests at no cost versus testing as usual for 15 months. The primary outcome was self-reported number of HIV tests during follow-up. To evaluate self-testing's impact on sexual behavior, we compared the following between arms: non-HIV-concordant condomless anal intercourse and number of male condomless anal intercourse partners in the last 3 months (measured at 9 and 15 months), and diagnosis with a bacterial sexually transmitted infection (STI: early syphilis, gonorrhea, and chlamydial infection) at the final study visit (15 months). A post hoc analysis compared the number of STI tests reported during follow-up. Results: Men randomized to self-testing reported significantly more HIV tests during follow-up (mean = 5.3, 95% confidence interval = 4.7 to 6.0) than those randomized to testing as usual (3.6, 3.2 to 4.0; P < 0.0001), representing an average increase of 1.7 tests per participant over 15 months. Men randomized to self-testing reported using an average of 3.9 self-tests. Self-testing was noninferior with respect to all markers of HIV risk. Men in the self-testing arm reported significantly fewer STI tests during follow-up (mean = 2.3, 95% confidence interval = 1.9 to 2.7) than men in the control arm (3.2, 2.8 to 3.6; P = 0.0038).
Conclusions: Access to free HIV self-testing increased testing frequency among high-risk men who have sex with men and did not impact sexual behavior or STI acquisition. IMPORTANCE Self-testing for HIV infection may contribute to early diagnosis of HIV, but without necessarily increasing antiretroviral therapy (ART) initiation. OBJECTIVE To investigate whether offering optional home initiation of HIV care after HIV self-testing might increase demand for ART initiation, compared with HIV self-testing accompanied by facility-based services only. DESIGN, SETTING, AND PARTICIPANTS Cluster randomized trial conducted in Blantyre, Malawi, between January 30 and November 5, 2012, using restricted 1:1 randomization of 14 community health worker catchment areas. Participants were all adult (≥16 years) residents (n = 16,660) who received access to home HIV self-testing through resident volunteers. This was a second-stage randomization of clusters allocated to the HIV self-testing group of a parent trial. INTERVENTIONS Clusters were randomly allocated to facility-based care or optional home initiation of HIV care (including 2 weeks of ART if eligible) for participants reporting positive HIV self-test results. MAIN OUTCOMES AND MEASURES The preplanned primary outcome compared between groups the proportion of all adult residents who initiated ART within the first 6 months of HIV self-testing availability. Secondary outcomes were uptake of HIV self-testing, reporting of positive HIV self-test results, and rates of loss from ART at 6 months. RESULTS A significantly greater proportion of adults in the home group initiated ART (181/8194, 2.2%) compared with the facility group (63/8466, 0.7%; risk ratio [RR], 2.94; 95% CI, 2.10-4.12; P < .001). Uptake of HIV self-testing was high in both the home (5287/8194, 64.9%) and facility groups (4433/8466, 52.7%; RR, 1.23; 95% CI, 0.96-1.58; P = .10).
Significantly more adults reported positive HIV self-test results in the home group (490/8194 [6.0%]) than in the facility group (278/8466 [3.3%]; RR, 1.86; 95% CI, 1.16-2.97; P = .006). After 6 months, 52 of 181 ART initiators (28.7%) and 15 of 63 ART initiators (23.8%) in the home and facility groups, respectively, were lost from ART (adjusted incidence rate ratio, 1.18; 95% CI, 0.62-2.25; P = .57). CONCLUSIONS AND RELEVANCE Among Malawian adults offered HIV self-testing, optional home initiation of care compared with standard HIV care resulted in a significant increase in the proportion of adults initiating ART. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT01414413
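The risk ratios quoted in these trial abstracts follow from the reported event counts. A minimal sketch of the arithmetic, using the ART-initiation counts from the Malawi trial (181/8194 home vs 63/8466 facility): the crude risk ratio is the ratio of the two proportions, with a Wald confidence interval built on the log scale. Note this unadjusted calculation will not exactly match the paper's RR 2.94 (95% CI 2.10-4.12), which accounts for the cluster randomization.

```python
import math

def risk_ratio_ci(a, n1, b, n2, z=1.96):
    """Crude risk ratio and Wald 95% CI on the log scale.
    a/n1 = events/total in the exposed group; b/n2 in the control group."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)  # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# ART initiation: home group 181/8194 vs facility group 63/8466.
rr, lo, hi = risk_ratio_ci(181, 8194, 63, 8466)
print(f"crude RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # crude RR 2.97 (95% CI 2.23-3.95)
```

The crude interval is narrower than the published one because ignoring clustering understates the variance.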
11,072
25,407,633
Over half of the studies were effective in the short term at significantly reducing body weight and/or BMI; however, few showed long-term maintenance.
Young adulthood is a high-risk life stage for weight gain. Evidence is needed to translate behavioural approaches into community practice to prevent weight gain in young adults. This systematic review assessed the effectiveness of prevention interventions and their reporting of external validity components.
Background A criticism of Randomized Controlled Trials (RCTs) in primary care is that they lack external validity, participants being unrepresentative of the wider population. Our aim was to determine whether published primary care-based RCTs report information about how the study sample is assembled, and whether this is associated with RCT characteristics. Methods We reviewed RCTs published in four primary care journals in the years 2001-2004. Main outcomes were: (1) eligibility fraction (proportion eligible of those screened), (2) enrolment fraction (proportion randomised of those eligible), (3) recruitment fraction (proportion of potential participants actually randomised), and (4) number of patients needed to be screened (NNS) in order to randomise one participant. Results A total of 148 RCTs were reviewed. One hundred and three trials (70%) reported the number of individuals assessed by investigators for eligibility, 119 (80%) reported the number eligible for participation, and all reported the actual number recruited. The median eligibility fraction was 83% (IQR 40% to 100%), and the median enrolment fraction was 74% (IQR 49% to 92%). The median NNS was 2.43, with some trials reportedly recruiting every patient or practice screened for eligibility, and one trial screening 484 for each patient recruited. We found no association between NNS and journal, trial size, multi- or single-centre status, funding source, or type of intervention. There may be associations between provision of sufficient recruitment data for the calculation of NNS and funding source and type of intervention. Conclusion RCTs reporting recruitment data in primary care suggest that once screened for eligibility and found to match inclusion criteria, patients are likely to be randomised. This finding needs to be treated with caution, as it may represent inadequate identification or reporting of the eligible population.
A substantial minority of RCTs did not provide sufficient information about the patient recruitment process. The transition to college has been identified as a critical period for increases in overweight status. Overweight college students are at risk of becoming obese adults, and thus prevention efforts targeting college-age individuals are key to reducing adult obesity rates. The current study evaluated an Internet intervention with first-year college students (N=170) randomly assigned to one of four treatment conditions: 1) no treatment, 2) 6-week online intervention, 3) 6-week weight and caloric feedback only (via email), and 4) 6-week combined feedback and online intervention. The combined intervention group had lower BMIs at post-testing than the other three groups. This study demonstrated the effectiveness and feasibility of an online intervention to prevent weight gain among college students. This study examines the combined effects of caloric restriction on body composition, blood lipids, and satiety in slightly overweight women by varying food density and aerobic exercise. Twenty-three women were randomly assigned to one of two groups for a four-week weight management program: the high-energy density diet plus exercise (HDE: n=12, 22±2 yrs, 65±7 kg, 164±5 cm, 35±4% fat) and low-energy density diet plus exercise (LDE: n=11, 22±1 yrs, 67±7 kg, 161±2 cm, 35±4% fat) groups. Subjects maintained a low-calorie diet (1,500 kcal/day) during the program. Isocaloric (483±26 kcal for HDE, 487±27 kcal for LDE) but different weights (365±68 g for HDE, 814±202 g for LDE) of lunch were provided. After lunch, they biked at 60% of maximum capacity for 40 minutes, five times per week. The hunger level was scaled (1: extremely hungry; 9: extremely full) at 17:30 each day. Before and after the program, the subjects' physical characteristics were measured, and fasting blood samples were drawn.
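The "isocaloric but different weight" lunch design above can be checked by computing energy density (kcal per gram) from the reported mean values; a quick sketch, using only the figures given in the abstract:

```python
def energy_density(kcal: float, grams: float) -> float:
    """Energy density of a meal in kcal per gram."""
    return kcal / grams

# Mean lunch values reported in the abstract.
hde = energy_density(483, 365)  # high-energy-density lunch
lde = energy_density(487, 814)  # low-energy-density lunch
print(f"HDE: {hde:.2f} kcal/g, LDE: {lde:.2f} kcal/g")  # HDE: 1.32 kcal/g, LDE: 0.60 kcal/g
```

The two lunches deliver nearly identical energy, but the LDE lunch is more than twice the mass, i.e. less than half the energy density, which is the manipulation the trial relies on.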
The daily energy intake was 1,551±259 kcal for HDE and 1,404±150 kcal for LDE (P>0.05). After four weeks, the subjects' weights and % fat decreased for both LDE (-1.9 kg and -1.5%, P<0.05) and HDE (-1.6 kg and -1.4%, respectively, P<0.05). The hunger level was significantly higher for HDE (2.46±0.28) than for LDE (3.10±0.26) (P<0.05). The results suggest that a low-energy density diet is more likely to be tolerated than a high-energy density diet for a weight management program combining a low-calorie diet and exercise, mainly because of a reduced hunger sensation. Aim: The aim of the study was to develop and implement an obesity and weight gain prevention program targeted to a high-risk group. Method: Women, 18–28 years old, with at least one severely obese parent, were randomized to the intervention or control group of the ‘Health Hunters’ program. During 1 year of follow-up, the intervention group received an individualized behavioral program focusing on food choice, physical activity and other lifestyle factors. Anthropometric measures, DXA-based body composition and fitness levels were measured at baseline and after 1 year. Self-reported changes in obesity-related behaviors were also assessed. Results: Baseline examinations were conducted in 40 women, of whom 30 completed follow-up examinations 1 year later. Pregnancy was the most common reason for failure to complete the study. Compared to the control group (which gained weight), the intervention group displayed significant improvements in body weight, body mass index, waist circumference, waist-to-hip ratio and self-reported physical activity. Changes in body composition, although not significant, suggested that the intervention tended to be associated with improved body composition.
Further analysis of changes in diet and fitness in relation to concurrent weight changes indicated that the strongest ‘protective’ associations were for energy percent protein, fiber density and fitness. Conclusion: Pilot data from the Health Hunters obesity prevention program indicate that it is effective in high-risk young women with familial predisposition for obesity. OBJECTIVE This paper describes the design and findings of a pilot Mothers In Motion (P-MIM) program. DESIGN A randomized controlled trial that collected data via telephone interviews and finger stick at 3 time points: baseline and 2 and 8 months post-intervention. SETTING Three Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) sites in southern Michigan. PARTICIPANTS One hundred and twenty-nine overweight and obese African-American and white mothers, 18-34 years old. INTERVENTION The 10-week, theory-based, culturally sensitive intervention messages were delivered via a series of 5 chapters on a DVD and complemented by 5 peer support group teleconferences. MAIN OUTCOME MEASURES Dietary fat, fruit, and vegetable intake; physical activity; stress; feelings; body weight; and blood glucose. ANALYSIS A general linear mixed model was applied to assess treatment effects across 2 and 8 months post-intervention. RESULTS No significant effect sizes were found in primary and secondary outcome variables at 2 and 8 months post-intervention. However, changes in body weight and blood glucose showed apparent trends consistent with the study's hypotheses. CONCLUSIONS AND IMPLICATIONS The P-MIM showed promise for preventing weight gain in low-income overweight and obese women. However, a larger experimental trial is warranted to determine the effectiveness of this intervention. Background The literature on changes in health-related quality of life (HRQOL) in weight loss studies is inconsistent, and few studies use more than one type of measure.
The purpose of the current study was to compare one-year changes in HRQOL as a function of weight change using three different measures: a weight-related measure (Impact of Weight on Quality of Life-Lite [IWQOL-Lite]) and two generic measures (SF-36; EQ-5D). Methods Data were obtained from 926 participants (mean body mass index (BMI) (kg/m2) = 35.4; 84% female; mean age = 49.5 years) in a placebo-controlled randomized trial for weight loss. At baseline and one year, participants completed all three HRQOL measures. HRQOL was compared across weight change categories (≥5% and 0–4.9% gain; 0–4.9%, 5.0–9.9% and ≥10% loss), using effect sizes. Results The weight-related measure of HRQOL exhibited greater improvements with one-year weight loss than either of the generic instruments, with effect sizes ranging from 0.24 to 0.62 for 5–9.9% weight reductions and 0.44 to 0.95 for ≥10% reductions. IWQOL-Lite Self-Esteem also showed a small improvement with weight gain. Changes in the two generic measures of HRQOL were inconsistent with each other and, in the case of the SF-36, variable across domains. For participants gaining ≥5% of weight, the greatest reductions in HRQOL occurred with respect to SF-36 Mental Health, MCS, and Vitality, with effect sizes of -0.82, -0.70, and -0.63 respectively. Conclusion This study found differences between weight-related and generic measures of health-related quality of life in a one-year weight loss trial, reflecting the potential value of using more than one measure in a trial. Although weight loss was generally associated with improved IWQOL-Lite, physical SF-36 subscale and EQ-5D scores, a small amount of weight gain was associated with a slight improvement on weight-specific HRQOL and almost no change on the EQ-5D, suggesting the need for further research to more fully study these relationships.
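The five weight-change categories used in the HRQOL analysis above can be expressed as a simple classifier over percent change from baseline; a sketch (the category labels follow the abstract, the function itself is not the authors' code):

```python
def weight_change_category(baseline_kg: float, one_year_kg: float) -> str:
    """Assign the weight-change category used in the abstract's analysis:
    >=5% gain, 0-4.9% gain, 0-4.9% loss, 5.0-9.9% loss, >=10% loss."""
    pct = 100 * (one_year_kg - baseline_kg) / baseline_kg
    if pct >= 5:
        return ">=5% gain"
    if pct >= 0:
        return "0-4.9% gain"
    if pct > -5:
        return "0-4.9% loss"
    if pct > -10:
        return "5.0-9.9% loss"
    return ">=10% loss"

print(weight_change_category(100, 93))  # 7% loss -> "5.0-9.9% loss"
```

Under this scheme the 0.24–0.62 effect sizes reported above correspond to the "5.0-9.9% loss" band and the 0.44–0.95 effect sizes to the ">=10% loss" band.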
We believe our findings have relevance for weight loss patients and obesity clinicians/researchers in informing them of likely HRQOL outcomes associated with varying amounts of weight loss or gain. PURPOSE To describe the theoretical rationale, intervention design, and clinical trial of a two-year weight control intervention for young adults deployed via social and mobile media. METHODS A total of 404 overweight or obese college students from three Southern California universities (Mage = 22 (±4) years; MBMI = 29 (±2.8); 70% female) were randomized to participate in the intervention or to receive an informational web-based weight loss program. The intervention is based on behavioral theory and integrates intervention elements across multiple touch points, including Facebook, text messaging, smartphone applications, blogs, and e-mail. Participants are encouraged to seek social support among their friends, self-monitor their weight weekly, post their health behaviors on Facebook, and e-mail their weight loss questions/concerns to a health coach. The intervention is adaptive because new theory-driven and iteratively tailored intervention elements are developed and released over the course of the two-year intervention in response to patterns of use and user feedback. Measures of body mass index, waist circumference, diet, physical activity, sedentary behavior, weight management practices, smoking, alcohol, sleep, body image, self-esteem, and depression occur at 6, 12, 18, and 24 months. Currently, all participants have been recruited, and all are in the final year of the trial.
CONCLUSION Theory-driven, evidence-based strategies for physical activity, sedentary behavior, and dietary intake can be embedded in an intervention using social and mobile technologies to promote healthy weight-related behaviors in young adults. Exercise is recommended by public health agencies for weight management; however, the role of exercise is generally considered secondary to energy restriction. Few studies exist that have verified completion of exercise, measured the energy expenditure of exercise, and prescribed exercise with equivalent energy expenditure across individuals and genders. Objective The objective of this study was to evaluate aerobic exercise, without energy restriction, on weight loss in sedentary overweight and obese men and women. Design and Methods This investigation was a randomized, controlled efficacy trial in 141 overweight and obese participants (body mass index, 31.0±4.6 kg/m2; age 22.6±3.9 years). Participants were randomized (2:2:1 ratio) to exercise at either 400 kcal/session or 600 kcal/session or to a non-exercise control. Exercise was supervised, 5 days/week, for 10 months. All participants were instructed to maintain their usual ad libitum diets. Due to the efficacy design, completion of ≥90% of exercise sessions was an a priori definition of per protocol, and these participants were included in the analysis. Results Weight loss from baseline to 10 months for the 400 and 600 kcal/session groups was 3.9±4.9 kg (4.3%) and 5.2±5.6 kg (5.7%), respectively, compared to a weight gain for controls of 0.5±3.5 kg (0.5%) (p<0.05). Differences in weight loss from baseline to 10 months between the exercise groups, and differences between men and women within groups, were not statistically significant.
Conclusions Supervised exercise, with equivalent energy expenditure, results in clinically significant weight loss with no significant difference between men and women. OBJECTIVE To conduct a prospective, longitudinal study examining weight fluctuation and its predictors before and during the first year of college. DESIGN Men (n = 266) and women (n = 341) enrolled at Dartmouth College (age range: 16 to 26; body mass index range: 15.0 to 42.9) provided self-reports of weight and height and completed measures of self-esteem, eating habits, interpersonal relationships, exercise patterns, and disordered eating behaviors both in their senior year of high school and either 3, 6, or 9 months into college. MAIN OUTCOME MEASURE Self-reported weight was the primary outcome indicator. RESULTS Analyses indicated that both men and women gained a significant amount of weight (3.5 and 4.0 pounds, respectively). Weight gain occurred before November of the first academic year and was maintained as the year progressed. College freshmen gain weight at a much higher rate than that of average American adults. For men, frequently engaging in exercise predicted weight gain. Having troublesome relationships with parents also predicted weight gain in men, whereas for women, having positive relationships with parents predicted weight gain. CONCLUSION Understanding the predictors of early college weight gain may aid in the development of prevention programs. OBJECTIVE To examine whether the duration of abdominal obesity, determined prospectively using measured waist circumference (WC), is associated with the development of new-onset diabetes independent of the degree of abdominal adiposity. RESEARCH DESIGN AND METHODS The Coronary Artery Risk Development in Young Adults Study is a multicenter, community-based, longitudinal cohort study of 5,115 white and black adults aged 18–30 years in 1985 to 1986.
Years spent abdominally obese were calculated for participants without abdominal obesity (WC > 102 cm in men and > 88 cm in women) or diabetes at baseline (n = 4,092) and were based upon repeat measurements conducted 2, 5, 7, 10, 15, 20, and 25 years later. RESULTS Over 25 years, 392 participants developed incident diabetes. Overall, following adjustment for demographics, family history of diabetes, study center, and time-varying WC, energy intake, physical activity, smoking, and alcohol, each additional year of abdominal obesity was associated with a 4% higher risk of developing diabetes [hazard ratio (HR) 1.04 (95% CI 1.02–1.07)]. However, a quadratic model best represented the data. HRs for 0, 1–5, 6–10, 11–15, 16–20, and >20 years of abdominal obesity were 1.00 (referent), 2.06 (1.43–2.98), 3.45 (2.28–5.22), 3.43 (2.28–5.22), 2.80 (1.73–4.54), and 2.91 (1.60–5.29), respectively; P-quadratic < 0.001. CONCLUSIONS Longer duration of abdominal obesity was associated with substantially higher risk for diabetes, independent of the degree of abdominal adiposity. Preventing, or at least delaying, the onset of abdominal obesity in young adulthood may lower the risk of developing diabetes through middle age. Background Recruitment and retention are key functions for programs promoting nutrition and other lifestyle behavioral changes in low-income populations. This paper describes strategies for recruitment and retention and presents predictors of early (two-month post-intervention) and late (eight-month post-intervention) dropout (non-retention) and overall retention among young, low-income overweight and obese mothers participating in a community-based randomized pilot trial called Mothers In Motion.
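The abdominal-obesity rule and the per-year hazard ratio reported in the CARDIA abstract above can be sketched directly; note the abstract itself cautions that a quadratic model fit the data better than this simple multiplicative per-year model:

```python
def abdominally_obese(waist_cm: float, sex: str) -> bool:
    """Abdominal-obesity definition from the abstract:
    WC > 102 cm for men, WC > 88 cm for women."""
    threshold = 102 if sex == "male" else 88
    return waist_cm > threshold

# Under the linear (per-year) model, HR 1.04 per year compounds multiplicatively.
# The abstract notes a quadratic model represented the data better, so this is
# only the naive extrapolation.
hr_per_year = 1.04
print(round(hr_per_year ** 10, 2))  # 1.48 after 10 years under the linear model
```

The contrast between the compounded linear estimate (about 1.48 at 10 years) and the reported categorical HR of 3.45 for 6–10 years illustrates why the quadratic model was preferred.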
Methods Low-income overweight and obese African American and white mothers ages 18 to 34 were recruited from the Special Supplemental Nutrition Program for Women, Infants, and Children in southern Michigan. Participants (n = 129) were randomly assigned to an intervention (n = 64) or control (n = 65) group according to a stratification procedure to equalize representation in two racial groups (African American and white) and three body mass index categories (25.0-29.9 kg/m2, 30.0-34.9 kg/m2, and 35.0-39.9 kg/m2). The 10-week theory-based, culturally sensitive intervention focused on healthy eating, physical activity, and stress management messages that were delivered via an interactive DVD and reinforced by five peer-support group teleconferences. Forward stepwise multiple logistic regression was performed to examine whether dietary fat, fruit and vegetable intake behaviors, physical activity, perceived stress, positive and negative affect, depression, and race predicted dropout, as data were collected two months and eight months after the active intervention phase. Results Trained personnel were successful in recruiting subjects. Increased level of depression was a predictor of early dropout (odds ratio = 1.04; 95% CI = 1.00, 1.08; p = 0.03). Greater stress predicted late dropout (odds ratio = 0.20; 95% CI = 0.00, 0.37; p = 0.01). Dietary fat, fruit, and vegetable intake behaviors, physical activity, positive and negative affect, and race were not associated with either early or late dropout. Less negative affect was a marginal predictor of participant retention (odds ratio = 0.57; 95% CI = 0.31, 1.03; p = 0.06). Conclusion Dropout rates in this study were higher for participants who reported higher levels of depression and stress.
Trial registration: Current Controlled Trials. The college transition represents a critical period for maintaining a healthy weight, yet intervention participation and retention represent significant challenges. The objective of this investigation was to evaluate the preliminary efficacy and acceptability of two interventions to prevent freshman weight gain. One intervention provided opportunities to improve outcome expectations and self-efficacy within a social cognitive theory framework (SCT), while the other targeted the same variables but focused on explicit training in self-regulation skills (SCTSR). Methods. Freshmen (n = 45) aged >18 years were randomized to a 14-week intervention, SCT or SCTSR; both included online modules and in-class meetings. Of the 45 students randomized, 5 withdrew before the classes began and 39 completed pre- and post-testing. Primary outcomes included body weight/composition, health behaviors, and program acceptability. Analyses included independent-sample t-tests, repeated measures ANOVA, and bivariate correlational analyses. Results. Body weight increased over the 14-week period, but there was no group difference. Percent body fat increased in SCTSR but not SCT (mean difference: SCTSR, +1.63±0.52%; SCT, −0.25±0.45%; P = 0.01). Class attendance was 100% (SCTSR) and 98% (SCT); SCTSR students (>50%) remarked that the online tracking required “too much time.” Conclusions. The intervention was well received, although there were no improvements in weight outcomes. PURPOSE To investigate whether fast food consumption and breakfast skipping are associated with weight gain during the transition from adolescence to adulthood. METHODS A prospective study of 9,919 adolescents participating in Waves II (age range 11-21 years) and III (age range 18-27 years) of the National Longitudinal Study of Adolescent Health.
BMI z-scores (zBMI) were computed using the 2000 Centers for Disease Control and Prevention growth charts. Multivariate regression models assessed the relationship between Wave II fast food and breakfast consumption, and change in fast food and breakfast consumption between Waves II and III, and weight gain during the transition to adulthood. RESULTS Marked increases in fast food consumption and decreases in breakfast consumption occurred over the 5-year interval. More days of fast food consumption at Wave II predicted increased zBMI at Wave III. Fewer days of breakfast consumption at Wave II, and decreases in breakfast consumption between Waves II and III, predicted increased zBMI at Wave III. CONCLUSIONS Fast food consumption and breakfast skipping increased during the transition to adulthood, and both dietary behaviors are associated with increased weight gain from adolescence to adulthood. These behaviors may be appropriate targets for intervention during this important transition. Objective: To determine the effects of a 15-week high-intensity intermittent exercise (HIIE) program on subcutaneous and trunk fat and insulin resistance of young women. Design and procedures: Subjects were randomly assigned to one of three groups: HIIE (n=15), steady-state exercise (SSE; n=15) or control (CONT; n=15). The HIIE and SSE groups underwent a 15-week exercise intervention. Subjects: Forty-five women with a mean BMI of 23.2±2.0 kg m−2 and age of 20.2±2.0 years. Results: Both exercise groups demonstrated a significant improvement (P<0.05) in cardiovascular fitness. However, only the HIIE group had a significant reduction in total body mass (TBM), fat mass (FM), trunk fat and fasting plasma insulin levels. There was significant fat loss (P<0.05) in the legs compared to the arms in the HIIE group only. Lean women lost less fat after HIIE than overweight women.
Decreases in leptin concentrations were negatively correlated with increases in VO2peak (r=−0.57, P<0.05) and positively correlated with decreases in TBM (r=0.47; P<0.0001). There was no significant change in adiponectin levels after training. Conclusions: HIIE three times per week for 15 weeks, compared to the same frequency of SSE, was associated with significant reductions in total body fat, subcutaneous leg and trunk fat, and insulin resistance in young women. PURPOSE This article describes the effectiveness of goal-setting instruction in the CHOICES (Choosing Healthy Options in College Environments and Settings) study, an intervention evaluating the effectiveness of weight gain prevention strategies for 2-year college students. METHODS Four hundred and forty-one participants from three community colleges were recruited. Participants randomized into the intervention (n=224) enrolled in a course that taught strategies to help maintain or achieve a healthy weight. Participants were instructed in SMART (Specific, Measurable, Attainable, Realistic, Time-based) and behavioral goal-setting practices. Throughout the course, participants set goals related to improving their sleep, stress management, exercise, and nutrition. RESULTS Intervention participants set four hundred eighteen goals. Each goal was carefully evaluated. The efforts to teach behavioral goal-setting strategies were largely successful; however, efforts to convey the intricacies of SMART goal-setting were not as successful. CONCLUSIONS Implications for effective teaching of skills in setting SMART behavioral goals were realized in this study. The insights gained from the goal-setting activities of this study could be used to guide educators who utilize goals to achieve health behavior change.
RECOMMENDATIONS Based on the results of this study, it is recommended that very clear and directed instruction be provided, in addition to multiple opportunities for goal-setting practice. Implications for future interventions involving education about goal-setting activities are discussed. The objective of this study was to test the hypothesis that a nutrition course that stresses fundamental principles of human physiology, energy metabolism, and genetics helps prevent weight gain during the first 16 months of college life. A randomized controlled trial was conducted from January 1997 to May 1998 using volunteers. Forty female college freshmen participated in the intervention (college course, n = 21) and control (no course, n = 19) groups. The intervention was a one-semester nutrition science college course. Body weight, nutrient intakes, and knowledge were measured at baseline, the end of the intervention (4 months from baseline), and 1 year later (16 months from baseline). Statistical analysis was conducted using a repeated-measures analysis of variance. Higher body mass index (BMI) students (BMI > 24) in the intervention group (n = 11) reported lower fat (p = .04), protein (p = .03), and carbohydrate (p = .008) intakes compared with the higher-BMI students in the control group (n = 6). Dietary changes reported by the higher-BMI intervention students were associated with the maintenance of baseline body weight for 1 year, in contrast with the higher-BMI control students, who gained 9.2 ± 6.8 kg (p = .012). The findings suggest that nutrition education emphasizing human physiology and energy metabolism is an effective strategy to prevent weight gain in at-risk college students. OBJECTIVE Evaluate a selective prevention program targeting both eating disorder symptoms and unhealthy weight gain in young women.
METHOD Female college students at high risk for these outcomes by virtue of body image concerns (N = 398; M age = 18.4 years, SD = 0.6) were randomized to the Healthy Weight group-based 4-hr prevention program, which promotes gradual, lasting, healthy improvements to dietary intake and physical activity, or to an educational brochure control condition. RESULTS Compared to controls, intervention participants showed significantly greater reductions in body dissatisfaction and eating disorder symptoms, and greater increases in physical activity, at posttest, and significantly greater reductions in body mass index (BMI) and self-reported dieting at 6-month follow-up. Moderator analyses revealed significantly greater reductions in eating disorder symptoms for those with initially elevated symptoms and pressure to be thin, and significantly greater reductions in BMI for those with initially elevated eating disorder symptoms. CONCLUSIONS Results indicate that this intervention reduced both eating disorder symptoms and unhealthy weight gain, but suggest it should be improved to produce stronger and more persistent effects, and that it may be useful to target young women with both body image and eating disturbances. OBJECTIVE: To determine if treatment format will affect the willingness of women aged 25–34 to participate in a program for primary prevention of weight gain. DESIGN: 102 normal-weight women aged 25–34 were randomized to one of three treatment formats (group meetings, correspondence course, no-treatment control). Acceptability was evaluated by determining the proportion of women participating in their assigned format. Efficacy was assessed by determining mean weight changes at post-treatment (10 weeks) and 6-month follow-up, and the proportions of women who remained at baseline weights.
RESULTS: Significantly fewer women chose to participate in the group format than in the correspondence course and no-treatment control (42%, 84% and 62%, respectively). However, the group format produced the largest short-term changes in weight (−1.9±1.8 kg, −1.1±2.1 kg and −0.2±1.3 kg, respectively). CONCLUSIONS: The format of prevention programs may influence the willingness of subjects to participate, as well as treatment outcome. Both format acceptability and efficacy should be considered in determining the overall effectiveness of a program. Objective: Prevention would be the ideal public health strategy to face the current obesity epidemic. Adoption of healthy lifestyles during the first years of college or university could prevent the onset of weight gain associated with this period of acquired independence and eventually decrease the incidence of obesity. Design: Randomized controlled trial over a period of 2 years. The subjects received an educational/behavioral intervention (small-group seminars) designed to help maintain a healthy lifestyle, or no specific intervention (control group). Subjects: One hundred and fifteen non-obese freshmen in a Faculty of Medicine. Measurements: Anthropometric measurements, physical activity level, fitness level, food intake and lipid profile were recorded at predetermined intervals. Results: The control group gained weight, whereas the intervention group lost a slight amount of weight over 2 years. The difference between the two groups was 1.3 kg at the end of follow-up, with the trend of weight gain differing between the two groups during the 2-year intervention period (P=0.04). There was no detectable difference in fitness, physical activity level or total caloric intake between the two groups during follow-up. However, plasma triglyceride levels increased in the control group and decreased in the intervention group (P=0.04).
Conclusion: In this randomized controlled trial, a small-group seminar educational/behavioral intervention successfully prevented weight gain in normal-weight young healthy university students. Such small absolute changes in body composition and lipid profile, if maintained over a prolonged period, could result in significant long-term health benefits for the general population (ClinicalTrials.gov registration number: NCT00306449). OBJECTIVE Weight gain occurs frequently in men aged 25-40. This study compared the effectiveness of a clinic-based and a home-based intervention with a no-treatment control group in preventing this weight gain. RESEARCH METHODS AND PROCEDURES Men (n=67), aged 25 to 40, sedentary, with a body mass index of 22 to 30, recruited from the University of Pittsburgh, were randomly assigned to 4-month treatments focused on increasing aerobic exercise and reducing fat intake through a clinic-based (CB) or a home-based (HB) program, or to a delayed-treatment control group. Subjects were reassessed at 4 months. RESULTS Adherence and outcome did not differ significantly between the CB and HB programs, except that CB subjects recorded their food intake more frequently, and a greater number of CB subjects achieved a total of 120 miles of exercise over the 4 months. Subjects in the two intervention conditions combined lost significantly more weight (-1.6±2.5 kg) than control subjects, who gained 0.2±1.9 kg (p<0.01); this effect of treatment was seen primarily in men with a body mass index of 27 to 30 (-2.7 kg for CB and HB combined vs. +1.5 kg for control). Treated subjects also had somewhat greater improvements in body composition, aerobic fitness, and weekly energy expenditure than controls, although these differences did not reach significance. DISCUSSION Both the CB and HB interventions show promise in preventing weight gain in young men, especially in those who are slightly overweight.
Larger studies, using more representative samples of young men, appear warranted. Purpose. To identify the impact of an online nutrition and physical activity program for college students. Design. Randomized, controlled trial using online questionnaires and on-site physical and fitness assessments, with measurement intervals of 0 (baseline), 3 (post-intervention), and 15 months (follow-up). Setting. Online intervention delivered to college students; a centralized Web site was used for recruitment, data collection, data management, and intervention delivery. Subjects. College students (18–24 years old, n = 1689) from eight universities (Michigan State University, South Dakota State University, Syracuse University, The Pennsylvania State University, Tuskegee University, University of Rhode Island, University of Maine, and University of Wisconsin). Intervention. A 10-lesson curriculum focusing on healthful eating and physical activity, stressing non-dieting principles such as size acceptance and eating competence (software developer: Rainstorm, Inc, Orono, Maine). Measures. Measurements included anthropometrics, cardiorespiratory fitness, fruit/vegetable (FV) intake, eating competence, physical activity, and psychosocial stress. Analysis. Repeated-measures analysis of variance for outcome variables. Results. Most subjects were white undergraduate females (63%), with 25% either overweight or obese. The treatment-group completion rate for the curriculum was 84%. Over 15 months, the treatment group had significantly higher FV intake (+0.5 cups/d) and physical activity participation (+270 metabolic equivalent minutes per week) than controls. For both groups, anthropometric values and stress increased, and fitness levels decreased. Gender differences were present for most variables. First-year males and females gained more weight than participants in other school years. Conclusion.
A 10-week online nutrition and physical activity intervention to encourage competence in making healthful food and eating decisions had a positive, lasting effect on FV intake and maintained baseline levels of physical activity in a population that otherwise experiences significant declines in these healthful behaviors BACKGROUND In light of the current obesity epidemic, treatment models are needed that can prevent weight gain or provide weight loss. We examined the long-term effects of a supervised program of moderate-intensity exercise on body weight and composition in previously sedentary, overweight and moderately obese men and women. We hypothesized that a 16-month program of verified exercise would prevent weight gain or provide weight loss in the exercise group compared with controls. METHODS This was a randomized controlled efficacy trial. Participants were recruited from 2 midwestern universities and their surrounding communities. One hundred thirty-one participants were randomized to exercise or control groups, and 74 completed the intervention and all laboratory testing. Exercise was supervised, and the level of energy expenditure of exercise was measured. Controls remained sedentary. All participants maintained ad libitum diets. RESULTS Exercise prevented weight gain in women and produced weight loss in men. Men in the exercise group had significant mean +/- SD decreases in weight (5.2 +/- 4.7 kg), body mass index (calculated as weight in kilograms divided by the square of height in meters) (1.6 +/- 1.4), and fat mass (4.9 +/- 4.4 kg) compared with controls. Women in the exercise group maintained baseline weight, body mass index, and fat mass, and controls showed significant mean +/- SD increases in body mass index (1.1 +/- 2.0), weight (2.9 +/- 5.5 kg), and fat mass (2.1 +/- 4.8 kg) at 16 months.
No significant changes occurred in fat-free mass in either men or women; however, both had significantly reduced visceral fat. CONCLUSIONS Moderate-intensity exercise sustained for 16 months is effective for weight management in young adults CONTEXT Weight gain in young adults is an important public health problem and few interventions have been successful. BACKGROUND This pilot study evaluated the preliminary efficacy of two self-regulation approaches to weight-gain prevention: Small Changes (changes in energy balance of roughly 200 kcal/day) and Large Changes (initial weight loss of 5-10 lbs to buffer against future weight gains). INTERVENTION Participants were enrolled in 8-week programs teaching Small or Large Changes (SC; LC). Both approaches were presented in a self-regulation framework, emphasizing daily self-weighing. DESIGN Randomized controlled pilot study. SETTING/PARTICIPANTS Young adults (N=52) aged 18-35 years (25.6+/-4.7 years, BMI of 26.7+/-2.4 kg/m(2)) were recruited in Providence RI and Chapel Hill NC. MAIN OUTCOME MEASURES Adherence to intervention, weight change, and satisfaction/confidence in approach assessed at 0, 8, and 16 weeks. Data were collected in 2008 and analyzed in 2008-2009. RESULTS Participants attended 84% of sessions, and 86.5% and 84.5% of participants completed post-treatment and follow-up assessments, respectively. Participants adhered to their prescriptions. Daily weighing increased markedly in both groups, whereas the eating and exercise changes observed in the SC and LC groups reflected the specific approach taught. Weight changes were significantly different between groups at 8 weeks (SC= -0.68+/-1.5 kg, LC= -3.2+/-2.5 kg, p<0.001) and 16 weeks (SC= -1.5+/-1.8 kg, LC= -3.5+/-3.1 kg, p=0.006). Participants in both groups reported high levels of satisfaction and confidence in the efficacy of the approach they were taught.
CONCLUSIONS Both Small and Large Change approaches hold promise for weight-gain prevention in young adults; a fully powered trial comparing the long-term efficacy of these approaches is warranted Background Young adults are at risk for weight gain. Little is known about how to design weight control programs to meet the needs of young adults and few theory-based interventions have been evaluated in a randomized control trial. The Choosing Healthy Options in College Environments and Settings (CHOICES) study was funded to create a technology-based program for 2-year community college students to help prevent unhealthy weight gain. The purpose of this article is to (1) provide a brief background on weight-related interventions in young adults; (2) describe the study design for the CHOICES study, the conceptual model guiding the research and the CHOICES intervention; and (3) discuss implications of this research for health educators. Translation to Health Education Practice Our experiences from the CHOICES study will be useful in suggesting other theory-based models and intervention strategies that might be helpful in programs attempting to prevent unhealthy weight gain in young adults. In addition, this article discusses important considerations for working with 2-year colleges on this type of health promotion work BACKGROUND Today's generation of young adults are gaining weight faster than their parents; however, there remains insufficient evidence to inform interventions to prevent this weight gain. Mobile phones are a popular means of communication that may provide a convenient, inexpensive means to deliver health intervention programmes. This pilot study aimed to measure the effect of a 12-week mobile health (mHealth) intervention on body weight, body mass index and specific lifestyle behaviours addressed by the programme.
METHODS University students and staff aged 18-35 years (n = 51) were randomised (ratio 1:1, intervention:control). Both groups received a printed diet booklet with instructions prepared by a dietitian. The intervention group also received Short Message Service (SMS) text messages (four per week), e-mails (four per week), and had access to smartphone applications and Internet forums. RESULTS Pre- to post-intervention, participants in the intervention group decreased their body weight [mean (SD)] [-1.6 (2.6) kg], increased their light intensity activity [34 (35) min day(-1)] and reported an increased vegetable (1.0 median serving day(-1)) and decreased sugar-sweetened beverage intake [-355 (836) mL week(-1)]. Despite this, post-intervention changes in outcomes were not significantly different from controls. CONCLUSIONS The piloted mHealth programme provided some short-term positive changes in weight, nutrition and physical activity using a low cost, convenient delivery method for this population. However, changes were no different from those observed among controls. This might partly be explained by intervention participants' low engagement with the programme, which is likely to require further modification to provide more regular, personalised, monitored support ABSTRACT Young adults are at risk for weight gain in the transition to independent adulthood; 2-year college students are at greater risk and understudied relative to 4-year students. This project conducted formative research for a randomized controlled weight gain prevention trial among 2-year college students, to ensure appropriateness of content and delivery of a curriculum originally developed for 4-year college students. Data were collected from community college students, faculty, and staff from October 2009 to August 2011.
Work included focus groups and key informant interviews, curriculum pilot testing, and social network and support website beta testing. Based on focus groups and interviews, program content, course delivery modes, and communication channels were adjusted to meet population interests and preferences. The course was delivered successfully in pilot testing, and the website was received well by beta testers. Formative work successfully guided program adaptations to address population needs UNLABELLED Long-term resistance training (RT) may result in a chronic increase in 24-h energy expenditure (EE) and fat oxidation to a level sufficient to assist in maintaining energy balance and preventing weight gain. However, the impact of a minimal RT program on these parameters in an overweight college-aged population, a group at high risk for developing obesity, is unknown. PURPOSE We aimed to evaluate the effect of 6 months of supervised minimal RT in previously sedentary, overweight (mean +/- SEM, BMI = 27.7 +/- 0.5 kg x m(-2)) young adults (21.0 +/- 0.5 yr) on 24-h EE, resting metabolic rate (RMR), sleep metabolic rate (SMR), and substrate oxidation using whole-room indirect calorimetry 72 h after the last RT session. METHODS Participants were randomized to RT (one set, 3 d x wk(-1), three to six repetition maximums, nine exercises; N = 22) or control (C, N = 17) groups and completed all assessments at baseline and at 6 months. RESULTS There was a significant (P < 0.05) increase in 24-h EE in the RT (527 +/- 220 kJ x d(-1)) and C (270 +/- 168 kJ x d(-1)) groups; however, the difference between groups was not significant (P = 0.30). Twenty-four-hour fat oxidation (g x d(-1)) was not altered after RT; however, reductions in RQ assessed during both rest (P < 0.05) and sleep (P < 0.05) suggested increased fat oxidation in RT compared with C during these periods.
SMR (8.4 +/- 8.6%) and RMR (7.4 +/- 8.7%) increased significantly in RT (P < 0.001) but not in C, resulting in significant (P < 0.001) between-group differences for SMR with a trend for significant (P = 0.07) between-group differences for RMR. CONCLUSION A minimal RT program that required little time to complete (11 min per session) resulted in a chronic increase in energy expenditure. This adaptation in energy expenditure may have a favorable impact on energy balance and fat oxidation sufficient to assist with the prevention of obesity in sedentary, overweight young adults, a group at high risk for developing obesity The CONSORT statement is used worldwide to improve the reporting of randomised controlled trials. Kenneth Schulz and colleagues describe the latest version, CONSORT 2010, which updates the reporting guideline based on new methodological evidence and accumulating experience. To encourage dissemination of the CONSORT 2010 Statement, this article is freely accessible on bmj.com and will also be published in the Lancet, Obstetrics and Gynecology, PLoS Medicine, Annals of Internal Medicine, Open Medicine, Journal of Clinical Epidemiology, BMC Medicine, and Trials INTRODUCTION We hypothesized that fruit/vegetable intake and eating behaviors mediate the relationship between experimental condition and weight loss in a randomized trial evaluating a text-message based weight loss program. METHODS Overweight/obese individuals from San Diego, CA (N = 52 with complete data) were randomly assigned in 2007 into one of two groups for four months: 1) the intervention group that received 2-5 weight management text-messages p/day; 2) the usual-care comparison group. Three 24-hour recalls assessed fruit/vegetable intake change and the eating behavior inventory (EBI) measured change in eating behaviors. Regression path models tested intervention mediation.
RESULTS Direct effects of the intervention were found for change in body weight (b = -3.84, R(2) = 0.074), fruit/vegetable intake (b = 2.00, R(2) = 0.083), and EBI scores (b = 7.15, R(2) = 0.229) (ps < 0.05). The treatment group to weight change path was not statistically significant (b = -0.673, R(2) = 0.208) when fruit/vegetable intake change and EBI score change were specified as intervention mediators in the model. The total indirect effect was 3.17 lb, indicating that the indirect paths explained 82.6% of the total effect on weight change. DISCUSSION Fruit/vegetable intake and eating behaviors mediated the intervention's effect on weight change. The findings suggest that sending text-messages that promote healthy eating strategies resulted in moderate short-term weight loss
11,073
28,621,202
There was a statistically significant, small to medium effect for face-to-face delivered CBT in reducing suicidal ideation and behaviour, although there was significant heterogeneity between the included studies.
Abstract Cognitive Behavioural Therapy (CBT) is a widely used psychotherapeutic intervention for suicide prevention despite its efficacy for suicide prevention in adults remaining ambiguous. Reluctance or inability to access face-to-face help suggests that e-health delivery may be a valuable resource for suicidal people. The aim of this study was to systematically review and conduct meta-analysis on research assessing the efficacy of CBT delivered via face-to-face and e-health for suicidal ideation and behaviour.
Cognitive therapy and antidepressant medications are effective treatments for depression, but little is known about their relative efficacy in reducing individual depressive symptoms. Using data from a recent clinical trial comparing cognitive therapy, antidepressant medication, and placebo in the treatment of moderate-to-severe depression, we examined whether there was a relative advantage of any treatment in reducing the severity of specific depressive symptom clusters. The sample consisted of 231 depressed outpatients randomly assigned to: cognitive therapy for 16 weeks (n = 58); paroxetine treatment for 16 weeks (n = 116); or pill placebo for 8 weeks (n = 57). Differential change in five subsets of depressive symptoms was examined: mood, cognitive/suicide, anxiety, typical-vegetative, and atypical-vegetative symptoms. Medication led to a greater reduction in cognitive/suicide symptoms relative to placebo by 4 weeks, and both active treatments reduced these symptoms more than did placebo by 8 weeks. Cognitive therapy reduced the atypical-vegetative symptoms more than placebo by 8 weeks and more than medications throughout the trial. These findings suggest that medications and cognitive therapy led to different patterns of response to specific symptoms of depression and that the general efficacy of these two well-validated treatments may be driven in large part by changes in cognitive or atypical-vegetative symptoms IMPORTANCE In the United States, approximately 1 physician dies by suicide every day. Training physicians are at particularly high risk, with suicidal ideation increasing more than 4-fold during the first 3 months of internship year. Despite this increase, to our knowledge, very few efforts have been made to prevent the escalation of suicidal thoughts among training physicians.
OBJECTIVE To assess the effectiveness of a web-based cognitive behavioral therapy (wCBT) program delivered prior to the start of internship year in the prevention of suicidal ideation in medical interns. DESIGN, SETTING, AND PARTICIPANTS A randomized clinical trial conducted at 2 university hospitals with 199 interns from multiple specialties during academic years 2009-2010 or 2011-2012. The current study was conducted from May 2009 to June 2010 and May 2011 to June 2012, and data were analyzed using intent-to-treat principles, including last observation carried forward. INTERVENTIONS Interns were randomly assigned to 2 study groups (wCBT and attention-control group [ACG]) and completed study activities lasting 30 minutes each week for 4 weeks prior to starting internship year. Participants assigned to wCBT completed online CBT modules and those assigned to ACG received emails with general information about depression, suicidal thinking, and local mental health professionals. MAIN OUTCOMES AND MEASURES The Patient Health Questionnaire-9 was used to assess suicidal ideation (ie, "thoughts that you would be better off dead or hurting yourself in some way") prior to the start of intern year and at 3-month intervals throughout the year. RESULTS A total of 62.2% of interns (199 of 320) agreed to take part in the study; 100 were assigned to the wCBT group and 99 to the ACG. During at least 1 point over the course of internship year, 12% of interns (12 of 100) assigned to wCBT endorsed suicidal ideation compared with 21.2% of interns (21 of 99) assigned to ACG. After adjusting for covariates identified a priori that have previously been shown to increase the risk for suicidal ideation, interns assigned to wCBT were less likely to endorse suicidal ideation during internship year (relative risk, 0.40; 95% CI, 0.17-0.91; P = .03) compared with those assigned to ACG.
CONCLUSIONS AND RELEVANCE This study demonstrates that a free, easily accessible, brief wCBT program is associated with reduced likelihood of suicidal ideation among medical interns. Prevention programs with these characteristics could be easily disseminated to medical training programs across the country. TRIAL REGISTRATION anzctr.org.au Identifier: ACTRN12610000628044 A system has been constructed to evaluate the design, implementation, and analysis of randomized control trials (RCT). The degree of quadruple blinding (the randomization process, the physicians and patients as to therapy, and the physicians as to ongoing results) is considered to be the most important aspect of any trial. The analytic techniques are scored with the same emphasis as is placed on the control of bias in the planning and implementation of the studies. Description of the patient and treatment materials and the measurement of various controls of quality have less weight. An index of quality of a RCT is proposed with its pros and cons. If published papers were to approximate these principles, there would be a marked improvement in the quality of randomized control trials. Finally, a reasonable standard design and conduct of trials will facilitate the interpretation of those with conflicting results and help in making valid combinations of undersized trials Objectives The effect of web-based interventions for depression on suicide ideation in callers to helplines is not known. The aim of this study was to determine if web-based Cognitive Behaviour Therapy (CBT) with and without telephone support is effective in reducing suicide ideation in callers to a helpline compared with treatment as usual (TAU). A secondary aim was to examine the factors that predict change in suicide ideation. Putative predictors included level of baseline depression, suicide behaviour, baseline anxiety and type of intervention. Design Randomised controlled trial.
Setting Lifeline, Australia's 24 h telephone counselling service. Participants: 155 callers to a national helpline service with moderate-to-high psychological distress. Interventions Participants were recruited and randomised to receive either 6 weeks of internet CBT plus weekly telephone follow-up; internet CBT only; weekly telephone follow-up only; or a wait-list TAU control group. Primary and secondary outcome measures Suicidal ideation was measured using four items from the 28-item General Health Questionnaire. Predictors of change in ideation were tested using logistic regression analysis. Results Regardless of the intervention condition, participants showed significant reductions in suicidal ideation over 12 months (p<0.001). Higher baseline suicidal behaviour decreased the odds of remission of suicidal ideation at postintervention (OR 0.409, p<0.001). However, change in depression over the course of the interventions was associated with improvement in suicide ideation (OR 1.165, p<0.001). Conclusions Suicide ideation declines with and without proactive intervention. Improvements in depression are associated with the resolution of suicide ideation. Specific interventions focusing on suicide ideation should be further investigated. Trial registration Controlled-Trials.com ISRCTN93903959 OBJECTIVE The authors evaluated the effectiveness of brief cognitive-behavioral therapy (CBT) for the prevention of suicide attempts in military personnel. METHOD In a randomized controlled trial, active-duty Army soldiers at Fort Carson, Colo., who either attempted suicide or experienced suicidal ideation with intent, were randomly assigned to treatment as usual (N=76) or treatment as usual plus brief CBT (N=76). Assessment of incidence of suicide attempts during the follow-up period was conducted with the Suicide Attempt Self-Injury Interview.
Inclusion criteria were the presence of suicidal ideation with intent to die during the past week and/or a suicide attempt within the past month. Soldiers were excluded if they had a medical or psychiatric condition that would prevent informed consent or participation in outpatient treatment, such as active psychosis or mania. To determine treatment efficacy with regard to incidence and time to suicide attempt, survival curve analyses were conducted. Differences in psychiatric symptoms were evaluated using longitudinal random-effects models. RESULTS From baseline to the 24-month follow-up assessment, eight participants in brief CBT (13.8%) and 18 participants in treatment as usual (40.2%) made at least one suicide attempt (hazard ratio=0.38, 95% CI=0.16-0.87, number needed to treat=3.88), suggesting that soldiers in brief CBT were approximately 60% less likely to make a suicide attempt during follow-up than soldiers in treatment as usual. There were no between-group differences in severity of psychiatric symptoms. CONCLUSIONS Brief CBT was effective in preventing follow-up suicide attempts among active-duty military service members with current suicidal ideation and/or a recent suicide attempt BACKGROUND Bereavement following suicide is associated with an increased vulnerability for depression, complicated grief, suicidal ideation, and suicide. There is, however, a paucity of studies of the effects of interventions in suicide survivors. AIMS This study therefore examined the effects of a cognitive behavioral therapy (CBT)-based psychoeducational intervention on depression, complicated grief, and suicide risk factors in suicide survivors. METHOD In total, 83 suicide survivors were randomized to the intervention or the control condition in a cluster randomized controlled trial. Primary outcome measures included maladaptive grief reactions, depression, suicidal ideation, and hopelessness.
Secondary outcome measures included grief-related cognitions and coping styles. RESULTS There was no significant effect of the intervention on the outcome measures. However, the intensity of symptoms of grief, depressive symptoms, and passive coping styles decreased significantly in the intervention group but not in the control group. CONCLUSION The CBT-based psychoeducational intervention has no significant effect on the development of complicated grief reactions, depression, and suicide risk factors among suicide survivors. The intervention may, however, serve as supportive counseling for suicide survivors BACKGROUND Depression and alcohol misuse are among the most prevalent diagnoses in suicide fatalities. The risk posed by these disorders is exacerbated when they co-occur. Limited research has evaluated the effectiveness of common depression and alcohol treatments for the reduction of suicide vulnerability in individuals experiencing comorbidity. METHODS Participants with depressive symptoms and hazardous alcohol use were selected from two randomised controlled trials. They had received either a brief (1 session) intervention, or depression-focused cognitive behaviour therapy (CBT), alcohol-focused CBT, therapist-delivered integrated CBT, computer-delivered integrated CBT or person-centred therapy (PCT) over a 10-week period. Suicidal ideation, hopelessness, depression severity and alcohol consumption were assessed at baseline and 12-month follow-up. RESULTS Three hundred three participants were assessed at baseline and 12 months. Both suicidal ideation and hopelessness were associated with higher severity of depressive symptoms, but not with alcohol consumption. Suicidal ideation did not improve significantly at follow-up, with no differences between treatment conditions.
Improvements in hopelessness differed between treatment conditions; hopelessness improved more in the CBT conditions compared to PCT and in single-focused CBT compared to integrated CBT. LIMITATIONS Low retention rates may have impacted on the reliability of our findings. Combining data from two studies may have resulted in heterogeneity of samples between conditions. CONCLUSIONS CBT appears to be associated with reductions in hopelessness in people with co-occurring depression and alcohol misuse, even when it is not the focus of treatment. Less consistent results were observed for suicidal ideation. Establishing specific procedures or therapeutic content for clinicians to monitor these outcomes may result in better management of individuals with higher vulnerability for suicide BACKGROUND The World Health Organisation SUicide PREvention-Multisite Intervention Study on Suicide (WHO/SUPRE-MISS) investigates suicidal behaviours in a number of nations. The feasibility of the different branches of the study was piloted in Queensland, Australia. This paper reports on the community survey component. METHOD Randomised telephone interviews (n=11,572) were conducted to determine the lifetime prevalence of suicidal ideation and attempts, and corresponding socio-demographic and cultural characteristics. A subsequent postal survey sent to consenting individuals reporting lifetime suicide ideation/attempt (n=1311) was meant to ascertain the possible development of that behaviour along a continuum, psychiatric and psychological factors, suicidal transmission, help-seeking, and service utilisation. RESULTS Suicide ideation and attempts prevailed in individuals aged 25-44 years, and declined with increasing age. In most cases, suicidal experience/s did not develop over time with progressively increasing severity. Knowledge of someone else's suicidal behaviour significantly increased the risk of similar acts.
Almost half of the subjects contended with their suicidal crisis by over-drinking alcohol, and 1/3 through other forms of reckless behaviour. The ratio of completed to attempted suicide was 1 to 23. Less than 30% of subjects went to the hospital after their suicidal behaviour, and treatment received and staff attitudes were rated less favourably than those of General Practitioners. CONCLUSIONS This survey provides a reliable picture of suicide ideation and behaviour in the general population. Information on the development of the suicidal process, recklessness, and help-seeking attitudes may be valuable for future prevention strategies The outcome of a randomized controlled trial of cognitive behavior therapy in addition to treatment as usual (CBT plus TAU) compared with TAU alone (TAU) in one hundred and six participants meeting diagnostic criteria for borderline personality disorder is described. We anticipated that CBT plus TAU would decrease the number of participants with in-patient psychiatric hospitalizations or accident and emergency room contact or suicidal acts over twelve months of treatment and twelve months of follow-up, compared with TAU. We also anticipated that CBT plus TAU would lead to improvement in a range of secondary outcomes of mental health and social functioning compared to TAU. Of the 106 participants randomized, follow-up data on 102 (96%) were obtained at two years. Those randomized to CBT were offered an average of 27 sessions over 12 months and attended on average 16 (range 0 to 35). We found that the global odds ratio of a participant in the CBT plus TAU group compared with the TAU alone group having any of the outcomes of a suicidal act, in-patient hospitalization, or accident and emergency contact in the 24 months following randomization was 0.86 (95% confidence interval [CI] 0.45 to 1.66, p = 0.66).
The corresponding global odds ratio, excluding accident and emergency room contact, was 0.75 (95% CI 0.37 to 1.54, p = 0.44). In terms of the number of suicidal acts, there was a significant reduction over the two years in favor of CBT plus TAU over TAU, with a mean difference of -0.91 (95% CI -1.67 to -0.15, p = 0.020). Across both treatment arms there was gradual and sustained improvement in both primary and secondary outcomes, with evidence of benefit for the addition of CBT on the positive symptom distress index at one year, and on state anxiety, dysfunctional beliefs and the quantity of suicidal acts at two-year follow-up. CBT can deliver clinically important changes in relatively few clinical sessions in real clinical settings Objective To examine the effectiveness of a family-based grief counselling programme to prevent complicated grief among first-degree relatives and spouses of someone who had committed suicide. Design Cluster randomised controlled trial with follow-up at 13 months after the suicide. Setting General practices in the Netherlands. Participants 122 first-degree relatives and spouses of 70 people who committed suicide; 39 families (68 participants) were allocated to intervention, 31 families (54 participants) to control. Intervention A family-based, cognitive behaviour counselling programme of four sessions with a trained psychiatric nurse counsellor between three to six months after the suicide. Control participants received usual care. Main outcome measures Self-reported complicated grief. Secondary outcomes were the presence of maladaptive grief reactions, depression, suicidal ideation, and perceptions of being to blame for the suicide. Results The intervention was not associated with a reduction in complicated grief (mean difference −0.61, 95% confidence interval −6.05 to 4.83; P=0.82). Secondary outcomes were not affected either.
When adjusted for baseline inequalities, the intervention reduced the risk of perceptions of being to blame (odds ratio 0.18, 0.05 to 0.67; P=0.01) and maladaptive grief reactions (0.39, 0.15 to 1.01; P=0.06). Conclusions A cognitive behaviour grief counselling programme for families bereaved by suicide did not reduce the risk of complicated grief or suicidal ideation or the level of depression. The programme may help to prevent maladaptive grief reactions and perceptions of blame among first-degree relatives and spouses. Trial registration Current Controlled Trials ISRCTN66473618 CONTEXT Suicide attempts constitute a major risk factor for completed suicide, yet few interventions specifically designed to prevent suicide attempts have been evaluated. OBJECTIVE To determine the effectiveness of a 10-session cognitive therapy intervention designed to prevent repeat suicide attempts in adults who recently attempted suicide. DESIGN, SETTING, AND PARTICIPANTS Randomized controlled trial of adults (N = 120) who attempted suicide and were evaluated at a hospital emergency department within 48 hours of the attempt. Potential participants (N = 350) were consecutively recruited from October 1999 to September 2002; 66 refused to participate and 164 were ineligible. Participants were followed up for 18 months. INTERVENTION Cognitive therapy or enhanced usual care with tracking and referral services. MAIN OUTCOME MEASURES Incidence of repeat suicide attempts and number of days until a repeat suicide attempt. Suicide ideation (dichotomized), hopelessness, and depression severity at 1, 3, 6, 12, and 18 months. RESULTS From baseline to the 18-month assessment, 13 participants (24.1%) in the cognitive therapy group and 23 participants (41.6%) in the usual care group made at least 1 subsequent suicide attempt (asymptotic z score, 1.97; P = .049).
Using the Kaplan-Meier method, the estimated 18-month reattempt-free probability in the cognitive therapy group was 0.76 (95% confidence interval [CI], 0.62-0.85) and in the usual care group was 0.58 (95% CI, 0.44-0.70). Participants in the cognitive therapy group had a significantly lower reattempt rate (Wald chi2(1) = 3.9; P = .049) and were 50% less likely to reattempt suicide than participants in the usual care group (hazard ratio, 0.51; 95% CI, 0.26-0.997). The severity of self-reported depression was significantly lower for the cognitive therapy group than for the usual care group at 6 months (P = .02), 12 months (P = .009), and 18 months (P = .046). The cognitive therapy group reported significantly less hopelessness than the usual care group at 6 months (P = .045). There were no significant differences between groups based on rates of suicide ideation at any assessment point. CONCLUSION Cognitive therapy was effective in preventing suicide attempts for adults who recently attempted suicide. BACKGROUND AND AIMS In the past decade, a large body of research has demonstrated that internet-based interventions can have beneficial effects on depression. However, only a few clinical trials have compared internet-based depression therapy with an equivalent face-to-face treatment. The primary aim of this study was to compare treatment outcomes of an internet-based intervention with a face-to-face intervention for depression in a randomized non-inferiority trial. METHOD A total of 62 participants suffering from depression were randomly assigned to the therapist-supported internet-based intervention group (n=32) and to the face-to-face intervention (n=30). The 8-week interventions were based on cognitive-behavioral therapy principles. Patients in both groups received the same treatment modules in the same chronological order and time-frame.
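The 18-month reattempt-free probabilities in the cognitive therapy trial a few sentences back were estimated with the Kaplan-Meier method. A minimal product-limit sketch of that estimator (toy follow-up data for illustration, not the trial's records) looks like this:

```python
# Product-limit (Kaplan-Meier) survival estimate for right-censored data.
# Toy follow-up data for illustration only; not the trial's records.
def kaplan_meier(durations, observed):
    """Return (time, survival) pairs; observed[i] is True if subject i
    had the event, False if follow-up was censored."""
    # Standard convention: process events before censorings at tied times.
    ordered = sorted(zip(durations, observed), key=lambda p: (p[0], not p[1]))
    at_risk, surv, curve = len(durations), 1.0, []
    for t, event in ordered:
        if event:
            surv *= (at_risk - 1) / at_risk  # one event among at_risk subjects
            curve.append((t, surv))
        at_risk -= 1  # events and censorings both leave the risk set
    return curve

# Months until reattempt (True) or censoring (False) for 10 toy subjects.
months = [3, 6, 6, 9, 12, 15, 18, 18, 18, 18]
event = [True, True, False, True, False, True, False, False, False, False]
for t, s in kaplan_meier(months, event):
    print(f"S({t}) = {s:.3f}")
```

The key property this illustrates is that censored subjects still contribute to the risk set until they drop out, which is why the estimate differs from the naive fraction of subjects who never reattempted.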
Primary outcome measure was the Beck Depression Inventory-II (BDI-II); secondary outcome variables were suicidal ideation, anxiety, hopelessness and automatic thoughts. RESULTS The intention-to-treat analysis yielded no significant between-group difference (online vs. face-to-face group) for any of the pre- to post-treatment measurements. At post-treatment both treatment conditions revealed significant symptom changes compared to before the intervention. Within-group effect sizes for depression in the online group (d=1.27) and the face-to-face group (d=1.37) can be considered large. At 3-month follow-up, results in the online group remained stable. In contrast to this, participants in the face-to-face group showed significantly worsened depressive symptoms three months after termination of treatment (t=-2.05, df=19, p<.05). LIMITATIONS Due to the small sample size, it will be important to evaluate these outcomes in adequately powered trials. CONCLUSIONS This study shows that an internet-based intervention for depression is equally beneficial to regular face-to-face therapy. However, long-term efficacy, indicated by continued symptom reduction three months after treatment, could only be found for the online group. Abstract Objective To evaluate the efficacy of two internet interventions for community-dwelling individuals with symptoms of depression: a psychoeducation website offering information about depression and an interactive website offering cognitive behaviour therapy. Design Randomised controlled trial. Setting Internet users in the community, in Canberra, Australia. Participants 525 individuals with increased depressive symptoms recruited by survey and randomly allocated to a website offering information about depression (n = 166) or a cognitive behaviour therapy website (n = 182), or a control intervention using an attention placebo (n = 178).
Main outcome measures Change in depression, dysfunctional thoughts; knowledge of medical, psychological, and lifestyle treatments; and knowledge of cognitive behaviour therapy. Results Intention-to-treat analyses indicated that information about depression and interventions that used cognitive behaviour therapy and were delivered via the internet were more effective than a credible control intervention in reducing symptoms of depression in a community sample. For the intervention that delivered cognitive behaviour therapy, the reduction in score on the depression scale of the Center for Epidemiologic Studies was 3.2 (95% confidence interval 0.9 to 5.4). For the “depression literacy” site (BluePages), the reduction was 3.0 (95% confidence interval 0.6 to 5.2). Cognitive behaviour therapy (MoodGYM) reduced dysfunctional thinking and increased knowledge of cognitive behaviour therapy. Depression literacy (BluePages) significantly improved participants' understanding of effective evidence-based treatments for depression (P < 0.05). Conclusions Both cognitive behaviour therapy and psychoeducation delivered via the internet are effective in reducing symptoms of depression. BACKGROUND Self-harm is a major risk factor for completed suicide. AIMS To determine the efficacy of a brief psychological intervention - culturally adapted manual-assisted problem-solving training (C-MAP) - delivered following an episode of self-harm compared with treatment as usual (TAU). METHOD The study was a randomised controlled assessor-masked clinical trial (trial registration: ClinicalTrials.gov NCT01308151). All patients admitted after an episode of self-harm during the previous 7 days to the participating medical units of three university hospitals in Karachi, Pakistan, were included in the study. A total of 250 patients were screened and 221 were randomly allocated to C-MAP plus treatment as usual (TAU) or to TAU alone.
All patients were assessed at baseline, at 3 months (end of intervention) and at 6 months after baseline. The primary outcome measure was reduction in suicidal ideation at 3 months. The secondary outcome measures included hopelessness, depression, coping resources and healthcare utilisation. RESULTS A total of 108 patients were randomised to the C-MAP group and 113 to the TAU group. Patients in the C-MAP group showed statistically significant improvement on the Beck Scale for Suicide Ideation and Beck Hopelessness Inventory, which was sustained at 3 months after the completion of C-MAP. There was also a significant reduction in symptoms of depression compared with patients receiving TAU. CONCLUSIONS The positive outcomes of this brief psychological intervention in patients attempting self-harm are promising and suggest that C-MAP may have a role in suicide prevention. This study examines the efficacy of a short-term individual therapy, Manual Assisted Cognitive Treatment (MACT), which was developed to treat parasuicidal (suicidal or self-harming) patients. In this trial, MACT was modified to focus on deliberate self-harm (DSH) in patients with borderline personality disorder (BPD). Thirty BPD patients who were engaged in DSH while in ongoing treatments, i.e., treatment-as-usual (TAU), were randomly assigned to receive MACT (N = 15) or not. DSH and level of suicide ideation were assessed at the baseline, at completion of the MACT intervention, and six months later. Results indicated that MACT was associated with significantly less frequent DSH upon completion of the intervention and with significantly decreased DSH frequency and severity at the six-month follow-up. Moreover, MACT's contribution to reducing DSH frequency and severity was greater than the contribution by the amount of concurrent treatments. In contrast, MACT did not affect the level of suicide ideation and time-to-repeat of DSH.
In conclusion, MACT seems to be a promising intervention for DSH in patients with BPD. More definitive studies are needed. BACKGROUND Suicide behaviour in psychosis is a significant clinical and social problem. There is a dearth of evidence for psychological interventions designed to reduce suicide risk in this population. AIMS To evaluate a novel, manualised, cognitive behavioural treatment protocol (CBSPp) based upon an empirically validated theoretical model. METHODS A randomised controlled trial with independent and masked allocation and assessment of CBSPp with TAU (n=25, 24 sessions) compared to TAU alone (n=24) using standardised assessments. Measures of suicide probability and suicidal ideation were the primary outcomes, and measures of hopelessness, depression, psychotic symptoms, functioning, and self-esteem were the secondary outcomes, assessed at 4 and 6 months follow-up. RESULTS The CBSPp group improved differentially to the TAU group on two out of three primary outcome measures of suicidal ideation and suicide probability, and on secondary outcomes of hopelessness related to suicide probability, depression, some psychotic symptoms and self-esteem. CONCLUSIONS CBSPp is a feasible intervention which has the potential to reduce proxy measures of suicide in psychotic patients. Objective: The objective of the study was to find the efficacy of cognitive behaviour therapy (CBT) in the management of deliberate self-harm (DSH) patients. This study compared a 9-week individualised Cognitive Behaviour Therapy (CBT) programme for people with epilepsy (PWE) with a wait-list control. Fifty-nine PWE were randomised and 45 (75%) completed post-treatment outcomes. People with lower quality of life (QoL), particularly for cognitive functioning, were more likely to drop out.
Analyses based on treatment completers demonstrated significant improvements on the Neurological Depressive Disorders Inventory for Epilepsy (p = .045) and the Hospital Anxiety Depression Scale-Depression subscale (p = .048). Importantly, CBT significantly reduced the likelihood of clinical depressive symptoms (p = .014) and suicidal ideation (p = .005). Improvements were not observed for anxiety or QoL, nor maintained over time for depression. Results suggest that CBT was effective, but could be improved to increase patient retention and long-term outcomes. BACKGROUND Studies on the effects of interventions in patients who have attempted suicide in China have not been reported so far. AIMS To describe the basic situation surrounding the interventions and follow-up of patients who have attempted suicide and to determine whether the interventions would be effective in reducing repeat suicide attempts. METHOD 239 patients who had attempted suicide were evaluated in the emergency departments of four general hospitals. They were randomized into three groups: cognitive therapy group, telephone intervention group, and control group. Post-intervention the participants were evaluated at 3, 6, and 12 months separately by the following measurements: a detailed structured questionnaire, Beck Suicide Ideation Scale (SIS), Hamilton Rating Scale for Depression (HAMD), and a quality-of-life scale. RESULTS After 12 months, the cumulative dropout rate was 69.5% (n = 57) for the cognitive therapy group, 55.0% (n = 44) for the telephone intervention group, and 64.9% (n = 50) for the control group. One patient (1.2%) in the cognitive therapy group, one patient (1.3%) in the telephone intervention group, and five patients (6.5%) in the control group made at least one subsequent suicide attempt. The rates of repeated attempted suicide among the three groups were not significantly different (χ² = 5.077, p = .08).
Five patients (6.1%) received cognitive therapy, and 60 patients (75.0%) received telephone intervention. There were no differences regarding the HAMD score, the quality-of-life scale, or the rates of subsequent suicide attempt and suicide ideation among the three groups at follow-up. CONCLUSIONS The dropout rates were higher than those reported in developed countries. Most participants in the cognitive therapy group refused to receive cognitive therapy, so the effect of cognitive therapy for these patients cannot be evaluated. The participants in the telephone intervention group had good compliance, but the effect of telephone intervention could not be confirmed, so more studies are needed in the future. Consequently, interventions for preventing repeat suicide attempts in patients who have attempted suicide cannot at present be evaluated accurately in China. Peters E, Landau S, McCrone P, Cooke M, Fisher P, Steel C, Evans R, Carswell K, Dawson K, Williams S, Howard A, Kuipers E. A randomised controlled trial of cognitive behaviour therapy for psychosis in a routine clinical service. Patients with schizophrenia are at high risk of suicide. Cognitive behavior therapy (CBT) has been shown to reduce symptoms in schizophrenia. This study examines whether CBT also changes the level of suicidal ideation in patients with schizophrenia compared to a control group. Ninety ambulatory patients with symptoms of schizophrenia resistant to conventional antipsychotic medication were randomized to CBT or befriending. They were assessed using the Comprehensive Psychopathological Rating Scale, including a rating of suicidal ideation, at baseline, post-intervention, and after 9 months. Post-hoc analysis revealed that CBT provided significant reductions in suicidal ideation at the end of therapy, sustained at the follow-up.
Further research is required to substantiate these findings and determine the process and mechanisms through which this reduction is achieved. OBJECTIVE People with substance use disorders who present with suicidal behavior are at high risk of subsequent suicide. There are few effective treatments specifically tailored for this population that diminish this risk. We aimed to assess the impact of an opportunistic cognitive behavioral intervention package (OCB) among adult outpatients with a substance use disorder and comorbid suicide risk. METHOD A randomized controlled trial was conducted across 2 sites in which 185 patients presenting with suicide risk and concurrent substance use received either OCB (8 sessions plus group therapy) or treatment as usual (TAU) over a 6-month period. Primary outcomes were suicidal behavior (suicide attempts, suicidal intent and presence of suicide ideation) and level of drug and alcohol consumption. Secondary outcomes were changes in psychological measures of suicide ideation, depression, anxiety, and self-efficacy. RESULTS There were no completed suicides, and only 2 participants reported suicide attempts at follow-up. Suicide ideation, alcohol consumption, and cannabis use fell over time but no significant Treatment × Time differences were found. There were also no differences between OCB and TAU over time on psychological measures of depression, anxiety, or self-efficacy. Suicide ideation at 6-month follow-up was predicted by cannabis use and higher scores on the Brief Psychiatric Rating Scale at baseline. CONCLUSIONS The opportunistic cognitive behavioral intervention package did not appear to be beneficial in reducing suicide ideation, drug and alcohol consumption, or depression relative to treatment as usual.
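Many of the trial abstracts above summarize their effects as odds ratios with 95% confidence intervals. As a generic sketch of that computation (Wald interval on the log odds ratio; the 2×2 counts below are hypothetical placeholders, not taken from any specific study above):

```python
import math

# Wald odds ratio with 95% CI from a 2x2 table.
# Counts are hypothetical placeholders for illustration.
def odds_ratio_ci(a, b, c, d, z=1.96):
    """a, b: events / non-events in group 1; c, d: same for group 2."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(13, 41, 23, 32)  # hypothetical counts
print(f"OR = {or_:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

A confidence interval that spans 1.0, as in several of the null results above, is what "not significantly different" means on the odds-ratio scale.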
11,074
24,434,085
Because of a high sensitivity of 94%, a specificity of 99%, and low interobserver variability, this modality is currently preferred for the diagnosis of diverticulitis, although US also has good sensitivity. Apart from diagnosing CRC, the detection of AA is of great importance because it bears the potential to progress to carcinoma. Colonoscopy is accompanied by such disadvantages as invasiveness and discomfort, potential adverse events such as perforation, and additional costs.
The use of routine colonoscopy after an episode of acute diverticulitis (AD) remains a point of debate. Most international and clinical practice guidelines advise endoscopy after conservatively treated diverticulitis. The rationale has always been to exclude an underlying malignancy or advanced colonic neoplasia (ACN). However, this is based merely on expert opinion. A recent article indicated that this may now be different with the increased use of abdominal CT imaging of diverticulitis. Routine colonoscopy after an uncomplicated episode of diverticulitis dates from a time when the diagnosis was primarily based on clinical examination and laboratory results, with frequent use of barium enema. However, in today’s clinical practice, CT is widely used for the diagnosis of diverticulitis, with the possibility to assess potential adverse events such as abscess, fistula, obstruction, or perforation as well.
BACKGROUND Screening for colorectal cancer is widely recommended, but the preferred strategy remains unidentified. We aimed to compare participation and diagnostic yield between screening with colonoscopy and with non-cathartic CT colonography. METHODS Members of the general population, aged 50-75 years, and living in the regions of Amsterdam or Rotterdam, identified via the registries of the regional municipal administration, were randomly allocated (2:1) to be invited for primary screening for colorectal cancer by colonoscopy or by CT colonography. Randomisation was done per household with a minimisation algorithm based on age, sex, and socioeconomic status. Invitations were sent between June 8, 2009, and Aug 16, 2010. Participants assigned to CT colonography who were found to have one or more large lesions (≥10 mm) were offered colonoscopy; those with 6-9 mm lesions were offered surveillance CT colonography. The primary outcome was the participation rate, defined as the number of invitees undergoing the examination relative to the total number of invitees. Diagnostic yield was calculated as the number of participants with advanced neoplasia relative to the total number of invitees. Invitees and screening centre employees were not masked to allocation. This trial is registered in the Dutch trial register, number NTR1829. FINDINGS 1276 (22%) of 5924 colonoscopy invitees participated, compared with 982 (34%) of 2920 CT colonography invitees (relative risk [RR] 1·56, 95% CI 1·46-1·68; p<0·0001). Of the participants in the colonoscopy group, 111 (9%) had advanced neoplasia, of whom seven (<1%) had a carcinoma. Of CT colonography participants, 84 (9%) were offered colonoscopy, of whom 60 (6%) had advanced neoplasia, of whom five (<1%) had a carcinoma; 82 (8%) were offered surveillance.
The diagnostic yield for all advanced neoplasia was 8·7 per 100 participants for colonoscopy versus 6·1 per 100 for CT colonography (RR 1·46, 95% CI 1·06-2·03; p=0·02), and 1·9 per 100 invitees for colonoscopy and 2·1 per 100 invitees for CT colonography (RR 0·91, 0·66-2·03; p=0·56). The diagnostic yield for advanced neoplasia of 10 mm or more was 1·5 per 100 invitees for colonoscopy and 2·0 per 100 invitees for CT colonography, respectively (RR 0·74, 95% CI 0·53-1·03; p=0·07). Serious adverse events related to the screening procedure were post-polypectomy bleedings: two in the colonoscopy group and three in the CT colonography group. INTERPRETATION Participation in colorectal cancer screening with CT colonography was significantly better than with colonoscopy, but colonoscopy identified significantly more advanced neoplasia per 100 participants than did CT colonography. The diagnostic yield for advanced neoplasia per 100 invitees was similar for both strategies, indicating that both techniques can be used for population-based screening for colorectal cancer. Other factors such as cost-effectiveness and perceived burden should be taken into account when deciding which technique is preferable. FUNDING Netherlands Organisation for Health Research and Development, Centre for Translational Molecular Medicine, and the Nuts Ohra Foundation. BACKGROUND Adenoma detection rate (ADR) has become the most important quality indicator for colonoscopy. OBJECTIVE The aim of this study was to investigate which modifiable factors, directly related to the endoscopic procedure, influenced the ADR in screening colonoscopies. DESIGN Observational, nested study. SETTING Multicenter, randomized, controlled trials. PATIENTS Asymptomatic people aged 50 to 69 years were eligible for a multicenter, randomized, controlled trial designed to compare colonoscopy and fecal immunochemical testing in colorectal cancer screening.
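The participation relative risk reported above (982 of 2920 CT colonography invitees vs 1276 of 5924 colonoscopy invitees, RR 1·56, 95% CI 1·46-1·68) can be reproduced from the raw counts with a standard log-RR Wald interval. This is our sketch of the calculation, not the authors' analysis code:

```python
import math

# Relative risk with a Wald 95% CI on the log scale.
# x1/n1 and x2/n2 are the event proportions in the two groups.
def rr_ci(x1, n1, x2, n2, z=1.96):
    rr = (x1 / n1) / (x2 / n2)
    se = math.sqrt(1 / x1 - 1 / n1 + 1 / x2 - 1 / n2)  # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Participation: CT colonography invitees vs colonoscopy invitees.
rr, lo, hi = rr_ci(982, 2920, 1276, 5924)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # matches the published 1.56 (1.46-1.68)
```

Working on the log scale before exponentiating is what makes the interval asymmetric around the point estimate, as in the published figures.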
A total of 4539 individuals undergoing a direct screening colonoscopy were included in this study. INTERVENTION Colonoscopy. MAIN OUTCOME MEASUREMENTS Bowel cleansing, sedation, withdrawal time in normal colonoscopies, and cecal intubation were analyzed as possible predictors of adenoma detection by using logistic regression analysis, adjusted for age and sex. RESULTS In multivariate analysis, after adjustment for age and sex, factors independently related to the ADR were a mean withdrawal time longer than 8 minutes (odds ratio [OR] 1.51; 95% CI, 1.17-1.96) in normal colonoscopies and split preparation (OR 1.26; 95% CI, 1.01-1.57). For advanced adenomas, only withdrawal time maintained statistical significance in the multivariate analysis. For proximal adenomas, withdrawal time and cecal intubation maintained independent statistical significance, whereas only withdrawal time longer than 8 minutes and a <10-hour period between the end of preparation and colonoscopy showed independent associations for distal adenomas. LIMITATIONS Only endoscopic variables have been analyzed. CONCLUSION Withdrawal time was the only modifiable factor related to the ADR in colorectal cancer screening colonoscopies associated with an increased detection rate of overall, advanced, proximal, and distal adenomas. BACKGROUND AND STUDY AIM Following acute diverticulitis, colonoscopy is advised to rule out malignancy. Commonly, the colonoscopy is postponed to avoid the potential risk of perforation. In a previous pilot, noncontrolled study, we showed that early colonoscopy is feasible in patients with acute diverticulitis. This randomized controlled trial compared early and late colonoscopy in hospitalized patients with acute diverticulitis. PATIENTS AND METHODS 154 patients diagnosed with acute diverticulitis were hospitalized between January 2004 and June 2006.
Of these, 35 patients were excluded because of either free perforation or pericolic air on computed tomography (CT), and another 18 because they had undergone colonoscopy in the previous year. The remaining 101 patients were offered the possibility of participating in the study, with random allocation to either early in-hospital colonoscopy or late colonoscopy, 6 weeks later. Randomization was refused by 15 patients, and 86 were included in the study. RESULTS 45 patients were randomly allocated for early colonoscopy and 41 for late colonoscopy. Three and 10 did not present for the examination in the early and late group, respectively. The cecum could not be reached in eight and three patients from the early and late groups, respectively. The colonoscopy revealed polyps in five patients, two in the early group and three in the late group. No malignancy was detected. There were no complications in either group. CONCLUSIONS Early colonoscopy in acute diverticulitis is feasible and safe in the absence of pericolic air on CT, and has greater compliance. However, no added value is apparent compared with the CT scan currently used. BACKGROUND Although the risk of bowel perforation is often cited as a major factor in the choice between colonoscopy and sigmoidoscopy for colorectal screening, good estimates of the absolute and relative risks of perforation are lacking. METHODS We used a large population-based cohort that consisted of a random sample of 5% of Medicare beneficiaries living in regions of the United States covered by the Surveillance, Epidemiology, and End Results (SEER) Program registries to determine rates of perforation in people aged 65 years and older.
We identified individuals who were cancer-free and had undergone colonoscopy or sigmoidoscopy between 1991 and 1998, calculated both the incidence and risk of perforation within 7 days of the procedure, and explored the impact on incidence and risk of perforation of age, race/ethnicity, sex, comorbidities, and indication for the procedure. We also estimated the risk of death after perforation. Risks were calculated with odds ratios (ORs) and 95% confidence intervals (CIs). All statistical tests were two-sided. RESULTS There were 77 perforations after 39 286 colonoscopies (incidence = 1.96/1000 procedures) and 31 perforations after 35 298 sigmoidoscopies (incidence = 0.88/1000 procedures). After adjustment, the OR for perforation from colonoscopy relative to perforation from sigmoidoscopy was 1.8 (95% CI = 1.2 to 2.8). Risk of perforation from either procedure increased in association with increasing age (P(trend)<.001 for both procedures) and the presence of two or more comorbidities (P(trend)<.001 for colonoscopy and P(trend) = .03 for sigmoidoscopy). Compared with those who were endoscopied and did not have a perforation, the risk of death was statistically significantly increased for those who had a perforation after either colonoscopy (OR = 9.0, 95% CI = 3.0 to 27.3) or sigmoidoscopy (OR = 8.8, 95% CI = 1.6 to 48.5). The risk of perforation after colonoscopy, especially for screening procedures, declined during the 8-year study period. CONCLUSIONS The risk of perforation after colonoscopy is approximately double that after sigmoidoscopy, but this difference appears to be decreasing. These observations should be useful to clinicians making screening and diagnostic decisions for individual patients and to policy officials setting guidelines for colorectal cancer screening programs. Several risk factors for colorectal cancer (CRC) have been identified.
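The perforation incidences quoted above are simple rates per 1000 procedures, which can be checked directly from the reported counts (77 perforations in 39 286 colonoscopies, 31 in 35 298 sigmoidoscopies):

```python
# Reproducing the perforation incidences quoted above.
def incidence_per_1000(events, procedures):
    return 1000 * events / procedures

colo = incidence_per_1000(77, 39286)
sig = incidence_per_1000(31, 35298)
print(f"colonoscopy: {colo:.2f}/1000, sigmoidoscopy: {sig:.2f}/1000")
# Note: the crude rate ratio (~2.2) differs from the published OR of 1.8,
# which was adjusted for covariates such as age and comorbidity.
```

The gap between the crude ratio and the adjusted OR is a reminder that the 1.8 figure is a model-based estimate, not a simple quotient of the two incidences.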
If individuals with risk factors are more likely to harbor cancer or its precursors, screening programs should be targeted toward this population. We evaluated the predictive value of colorectal cancer risk factors for the detection of advanced colorectal adenoma in a population-based CRC colonoscopy screening program. Data were collected in a multicenter trial conducted in the Netherlands, in which 6600 asymptomatic men and women between 50 and 75 years were randomly selected from a population registry. They were invited to undergo a screening colonoscopy. Based on a review of the literature, CRC risk factors were selected. Information on risk factors was obtained from screening attendees through a questionnaire. For each CRC risk factor, we estimated its odds ratio (OR) relative to the presence of advanced neoplasia as detected at colonoscopy. Of the 1426 screening participants who underwent a colonoscopy, 1236 (86%) completed the risk questionnaire. 110 participants (8.9%) had advanced neoplasia. The following risk factors were significantly associated with advanced neoplasia detected by colonoscopy: age (OR: 1.06 per year; 95% CI: 1.03-1.10), calcium intake (OR: 0.99 per mg; 95% CI: 0.99-1.00), positive CRC family history (OR: 1.55 per first-degree family member; 95% CI: 1.11-2.16) and smoking (OR: 1.75; 95% CI: 1.09-2.82). Elderly screening participants, participants with lower calcium intake, those with a CRC family history, and smokers are at increased risk of harboring detectable advanced colorectal neoplasia at screening colonoscopy. To define the syndrome of vasovagal reactions that occur during colonoscopy and to identify those risk factors associated with this development, we prospectively evaluated patients undergoing colonoscopy with monitored sedation. A total of 223 consecutive patients were evaluated during the 60-day study period.
A vasovagal reaction was defined as the occurrence of one or more of the following: diaphoresis, sustained bradycardia of less than 60 beats/min or a decrease in heart rate of 10%, or hypotension (systolic blood pressure less than 90 mm Hg, diastolic blood pressure less than 60 mm Hg, or a reduction in blood pressure of more than 10% below a baseline measurement before colonoscopy and after sedation). Thirty-seven (16.5%) of the 223 patients experienced a vasovagal reaction by our criteria. The remaining 186 patients did not; 100 of these patients were randomly selected by computer to form a control group. No statistically significant differences were observed between the vasovagal and control groups with regard to demographics, cardiopulmonary disease, cardiac medications, procedure success, the endoscopist, patient procedure tolerance, colon preparation, or procedure difficulty. A significant difference was seen in the mean dose of midazolam used in the vasovagal group as compared with that used in the control group (4.6 mg versus 3.9 mg, p < 0.04), and moderate to severe diverticulosis was more commonly seen in the vasovagal group as compared with the control group (43% versus 16%, p < 0.02). Thirteen (35%) of the 37 patients who had a vasovagal reaction required medical intervention (5.8% of the 223 patients). (ABSTRACT TRUNCATED AT 250 WORDS) BACKGROUND Serrated cancers account for 10% to 20% of all colorectal cancers (CRC) and more than 30% of interval cancers. The presence of proximal serrated polyps and large (≥10 mm) serrated polyps (LSP) has been correlated with colorectal neoplasia. OBJECTIVE To evaluate the prevalence of serrated polyps and their association with synchronous advanced neoplasia in a cohort of average-risk population and to assess the efficacy of one-time colonoscopy and a biennial fecal immunochemical test for reducing CRC-related mortality.
This study focused on the sample of 5059 individuals belonging to the colonoscopy arm. DESIGN Multicenter, randomized, controlled trial. SETTING The ColonPrev study, a population-based, multicenter, nationwide, randomized, controlled trial. PATIENTS A total of 5059 asymptomatic men and women aged 50 to 69 years. INTERVENTION Colonoscopy. MAIN OUTCOME MEASUREMENTS Prevalence of serrated polyps and their association with synchronous advanced neoplasia. RESULTS Advanced neoplasia was detected in 520 individuals (10.3%) (CRC was detected in 27 [0.5%] and advanced adenomas in 493 [9.7%]). Serrated polyps were found in 1054 individuals (20.8%). A total of 329 individuals (6.5%) had proximal serrated polyps, and 90 (1.8%) had LSPs. Proximal serrated polyps or LSPs were associated with male sex (odds ratio [OR] 2.08, 95% confidence interval [CI], 1.76-4.45 and OR 1.65, 95% CI, 1.31-2.07, respectively). Also, LSPs were associated with advanced neoplasia (OR 2.49, 95% CI, 1.47-4.198), regardless of their proximal (OR 4.15, 95% CI, 1.69-10.15) or distal (OR 2.61, 95% CI, 1.48-4.58) locations. When we analyzed subtypes of serrated polyps, proximal hyperplastic polyps were related to advanced neoplasia (OR 1.61, 95% CI, 1.13-2.28), although no correlation with the location of the advanced neoplasia was observed. LIMITATIONS Pathology criteria for the diagnosis of serrated polyps were not centrally reviewed. The morphology of the hyperplastic polyps (protruded or flat) was not recorded. Finally, because of the characteristics of a population-based study carried out in average-risk patients, the proportion of patients with CRC was relatively small. CONCLUSION LSPs, but not proximal serrated polyps, are associated with the presence of synchronous advanced neoplasia.
Further studies are needed to determine the risk of proximal hyperplastic polyps.

BACKGROUND Prediction of a technically difficult colonoscopy may influence patient selection and procedure scheduling. Identification of predictive factors may be difficult because a common endpoint used to evaluate the success of colonoscopy is intubation of the cecum, which is usually achieved. The goal of this study was to examine the feasibility of using an alternative measure, time required for cecal intubation, to identify factors that can impact performance of colonoscopy. METHODS The time required for cecal intubation was prospectively recorded for 802 consecutive outpatient colonoscopies performed by 7 experienced gastroenterologists. Patient data collected included height, weight, age, bowel habits, surgical history, and findings at colonoscopy. Forty-seven examinations that were stopped because of disease or unacceptable bowel preparation were excluded. The impact of the patient characteristics of the remaining sample of 755 patients on the median time required for cecal intubation for men and women was examined. RESULTS Older age and female gender, body mass index ≤ 25.0 (regardless of gender), diverticular disease in women, and a history of constipation or reported laxative use in men were predictors of difficult colonoscopy. CONCLUSIONS By using median time required for cecal intubation, several patient characteristics were identified that may predict technical difficulty at colonoscopy. These findings have implications for practice and teaching.

In order to elaborate evidence-based, national Danish guidelines for the treatment of diverticular disease, the literature was reviewed concerning the epidemiology, staging, diagnosis and treatment of diverticular disease in all its aspects.
The presence of colonic diverticula, which is considered to be a mucosal herniation through the intestinal muscle wall, is inversely correlated with the intake of dietary fibre. Other factors in the genesis of diverticular disease may be physical inactivity, obesity, and use of NSAIDs or acetaminophen. Diverticulosis is most common in Western countries, with a prevalence of 5% in the population aged 30-39 years and 60% in the part of the population > 80 years. The incidence of hospitalization for acute diverticulitis is 71/100,000 and the incidence of complicated diverticulitis is 3.5-4/100,000. Acute diverticulitis is conveniently divided into uncomplicated and complicated diverticulitis. Complicated diverticulitis is staged by the Hinchey classification 1-4 (1: mesocolic/pericolic abscess, 2: pelvic abscess, 3: purulent peritonitis, 4: faecal peritonitis). Diverticulitis is suspected in case of lower left quadrant abdominal pain and tenderness associated with fever and raised WBC and/or CRP, but the clinical diagnosis is not sufficiently precise. Abdominal CT confirms the diagnosis and enables classification of the disease according to Hinchey. The distinction between Hinchey 3 and 4 is made by laparoscopy or, when not possible, by laparotomy. Uncomplicated diverticulitis is treated by conservative means. There is no evidence of any beneficial effect of antibiotics in uncomplicated diverticulitis, but antibiotics may be used in selected cases depending on the overall condition of the patient and the severity of the infection. Abscess formation is best treated by US- or CT-guided drainage in combination with antibiotics. When the abscess is < 3 cm in diameter, drainage may be unnecessary and only antibiotics should be instituted.
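The Hinchey staging used throughout this guideline review is a simple ordinal classification, so it can be captured in a minimal lookup. The dictionary and function names below are hypothetical; the stage labels are copied verbatim from the text.

```python
# Hinchey classification of complicated diverticulitis, as summarized
# in the guideline review above (stage labels copied from the text).
HINCHEY_STAGES = {
    1: "mesocolic/pericolic abscess",
    2: "pelvic abscess",
    3: "purulent peritonitis",
    4: "faecal peritonitis",
}

def hinchey_label(stage: int) -> str:
    """Return the description for a Hinchey stage 1-4."""
    if stage not in HINCHEY_STAGES:
        raise ValueError(f"unknown Hinchey stage: {stage}")
    return HINCHEY_STAGES[stage]
```

A lookup like this also makes the treatment split in the text easy to express in downstream logic, since the review distinguishes stages 1-2 (abscess, drainage plus antibiotics) from 3-4 (peritonitis, surgical strategies).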
The surgical treatment of acute perforated diverticulitis has alternated between resection and non-resection strategies: the three-stage procedure dominating at the beginning of the 20th century was later replaced by the Hartmann procedure or, alternatively, resection of the sigmoid with primary anastomosis. Lately, a non-resection strategy consisting of laparoscopy with peritoneal lavage and drainage has been introduced in the treatment of Hinchey stage 3 disease. Evidence so far for the lavage regime is promising, comparing favourably with resection strategies, but lacking solid proof from randomized, controlled investigations. In recent years, morbidity has declined in complicated diverticulitis due to improved diagnostics and new treatment modalities. Recurrent diverticulitis is relatively rare and, furthermore, more often uncomplicated than previously assumed. Elective surgery in diverticular disease should probably be limited to symptomatic cases not amenable to conservative measures, since prophylactic resection of the sigmoid, evaluated from present evidence, confers unnecessary risks in terms of morbidity and mortality to the individual as well as unnecessary costs to society. Any recommendation for routine resection following multiple cases of diverticulitis should await the results of randomized studies. Laparoscopic resection is preferred in case of need for elective surgery. When malignancy is ruled out preoperatively, a sigmoid resection with preservation of the inferior mesenteric artery, oral division of the colon in soft compliant tissue and anastomosis to the upper rectum is recommended. Fistulae to the bladder or vagina, or stenosis of the colon, may be dealt with according to symptoms and comorbidity. Resection of the diseased segment of colon is preferred when possible and safe; alternatively, a diverting stoma can be the best solution.

Objective.
The aim of this study was to assess CT colonography (CTC) in the follow-up of diverticulitis regarding patient acceptance and diagnostic accuracy for diverticular disease, adenomas and cancer, with colonoscopy as the reference standard. Methods. A prospective comparative study in which half of the patients underwent colonoscopy first, followed immediately by CTC; the other half had the examinations in the reverse order. Patient experiences and findings were registered after every examination, blinded to the examiner. Results. Of a total of 110 consecutive patients, 108 were included in the study, with a median age of 56 years (range 27-84). The success rate was 91% for colonoscopy and 86% for CTC. Examination time was 25 min for both methods. The mean time for CTC evaluation was 20 min. Eighty-three per cent of the patients received sedation during colonoscopy. Despite this, patients experienced colonoscopy as more painful (p < 0.001) and uncomfortable (p < 0.001). Diverticulosis and polyps were detected in 94% and 20% with colonoscopy and in 94% and 29% with CTC, respectively. Sensitivity and specificity for CTC in the detection of diverticulosis were 99% and 67%, with good agreement (κ = 0.71). Regarding detection of polyps, the sensitivity and specificity were 47% and 75%, with poor agreement (κ = 0.17). No cancer was found. Conclusion. CTC was less painful and unpleasant and can be used for colonic investigation in the follow-up of diverticulitis. CTC detected diverticulosis with good accuracy, while the detection accuracy for small polyps was poor. CTC is a viable alternative, especially in case of incomplete colonoscopy or in a situation with limited colonoscopy resources.

BACKGROUND Colonoscopy and fecal immunochemical testing (FIT) are accepted strategies for colorectal-cancer screening in the average-risk population.
METHODS In this randomized, controlled trial involving asymptomatic adults 50 to 69 years of age, we compared one-time colonoscopy in 26,703 subjects with FIT every 2 years in 26,599 subjects. The primary outcome was the rate of death from colorectal cancer at 10 years. This interim report describes rates of participation, diagnostic findings, and occurrence of major complications at completion of the baseline screening. Study outcomes were analyzed in both intention-to-screen and as-screened populations. RESULTS The rate of participation was higher in the FIT group than in the colonoscopy group (34.2% vs. 24.6%, P<0.001). Colorectal cancer was found in 30 subjects (0.1%) in the colonoscopy group and 33 subjects (0.1%) in the FIT group (odds ratio, 0.99; 95% confidence interval [CI], 0.61 to 1.64; P=0.99). Advanced adenomas were detected in 514 subjects (1.9%) in the colonoscopy group and 231 subjects (0.9%) in the FIT group (odds ratio, 2.30; 95% CI, 1.97 to 2.69; P<0.001), and nonadvanced adenomas were detected in 1109 subjects (4.2%) in the colonoscopy group and 119 subjects (0.4%) in the FIT group (odds ratio, 9.80; 95% CI, 8.10 to 11.85; P<0.001). CONCLUSIONS Subjects in the FIT group were more likely to participate in screening than were those in the colonoscopy group. On the baseline screening examination, the numbers of subjects in whom colorectal cancer was detected were similar in the two study groups, but more adenomas were identified in the colonoscopy group. (Funded by Instituto de Salud Carlos III and others; ClinicalTrials.gov number, NCT00906997.)
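Unadjusted odds ratios like those reported in the screening trial above are computed from a 2x2 table of detected vs. not-detected counts per arm. A generic sketch follows, using Woolf's log-odds interval for the confidence bounds; the function names are ours, and the trial's published estimates may come from a different (e.g. adjusted or as-screened) analysis, so this is illustrative arithmetic only.

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table: a events / b non-events in group 1,
    c events / d non-events in group 2."""
    return (a / b) / (c / d)

def odds_ratio_ci95(a, b, c, d):
    """Approximate 95% CI via the log-odds (Woolf) method."""
    log_or = math.log(odds_ratio(a, b, c, d))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return math.exp(log_or - 1.96 * se), math.exp(log_or + 1.96 * se)
```

For example, on a toy table of 10/90 events vs. 5/95 events, `odds_ratio(10, 90, 5, 95)` is about 2.11, and the Woolf interval comfortably contains that point estimate.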
11,075
15,846,608
Nutritional support had no significant effect on anthropometric measures , lung function or exercise capacity in patients with stable COPD
BACKGROUND Low body weight in patients with chronic obstructive pulmonary disease (COPD) is associated with impaired pulmonary status, reduced diaphragmatic mass, lower exercise capacity and a higher mortality rate when compared with adequately nourished individuals with this disease. Nutritional support may therefore be a useful part of their comprehensive care. OBJECTIVES To conduct a systematic review of randomised controlled trials (RCTs) to clarify whether nutritional supplementation (caloric supplementation for at least 2 weeks) improved anthropometric measures, pulmonary function, respiratory muscle strength and functional exercise capacity in patients with stable COPD.
The inflammatory cytokines tumor necrosis factor-alpha (TNF-alpha) and interleukin-1-beta (IL-1 beta) have been associated with accelerated metabolism and protein turnover following exogenous administration in normal humans. We hypothesized that these inflammatory cytokines might contribute to the weight-losing process in patients with chronic obstructive pulmonary disease (COPD). COPD patients were identified prospectively as "weight losers" (WL; n = 10) if they reported > 5% weight loss during the preceding year, or as "weight stable" (WS; n = 10) if their body weight fluctuated ≤ 5%. Age-matched healthy volunteers were selected as the control group (C; n = 13). Monocytes were isolated from a peripheral blood sample, cultured, and exposed to lipopolysaccharide (LPS). The concentration of TNF-alpha and IL-1 beta in the monocyte supernatant was measured using a four-layer enhanced ELISA. No significant difference in LPS-stimulated IL-1 beta production was found among the three study populations. However, LPS-stimulated TNF-alpha production (mean [range] ng/ml) by monocytes was significantly higher in the WL COPD patients (20.2 [6.3 to 44.8]) compared with WS patients (6.9 [1.5 to 16.6]) and C subjects (5.7 [0 to 61.8]). This difference was not maintained at 6 mo follow-up in the absence of ongoing weight loss.
Definition of a causal relationship between TNF-alpha production and weight loss will require further understanding of the relationship between energy metabolism and TNF-alpha production in these patients.

The relation between vitamin A status and the degree of lung airway obstruction was examined in a cross-sectional study of 36 male subjects aged 43-74 y who were assigned to five groups as follows: healthy nonsmokers (n = 7), healthy smokers (n = 7), mild chronic obstructive pulmonary disease (COPD-mild) patients (n = 9), COPD-moderate-severe patients (n = 7), and COPD-moderate-severe patients with exacerbation (+ex; n = 6). Smoking habits, pulmonary function tests, and energy-protein status were assessed; serum concentrations of retinyl esters, retinol, retinol binding protein, and transthyretin and relative dose responses were measured. In addition, 12 male smokers aged 45-61 y with mild COPD were randomly assigned to two groups for a longitudinal study: six subjects consumed vitamin A (1000 RE/d; COPD-vitamin A) and six subjects received placebo for 30 d. Lowered serum retinol concentrations were found in the COPD-moderate-severe and COPD-moderate-severe+ex groups. Measurements of vitamin A status in healthy smokers and in COPD-mild patients were not different from those in healthy nonsmokers. The improvement of pulmonary function test results after vitamin A supplementation [mean increase in 1-s forced expiratory volume (FEV1) = 22.9% in the COPD-vitamin A group] may support the assumption of a local (respiratory) vitamin A deficiency in patients with this disease.

STUDY OBJECTIVE Weight loss is a common complication of COPD, associated with negative outcomes. Weight restoration has been associated with improved outcomes. The effects of oxandrolone, an adjunct to help restore weight, were evaluated in patients with COPD. DESIGN Prospective, open-label, 4-month clinical trial.
SETTING Twenty-five community-based pulmonary practices throughout the United States. PATIENTS A primary pulmonary diagnosis of moderate-to-severe COPD, as defined by FEV1 < 50% of predicted and FEV1/FVC ratio < 0.7, along with significant involuntary weight loss (weight ≤ 90% of ideal body weight). INTERVENTIONS Oral oxandrolone, 10 mg bid. MEASUREMENTS AND RESULTS Body weight, body composition (bioelectric impedance analysis), spirometry, and 6-min walking distance were measured. Data for 82 patients at 2 months and 55 patients at 4 months are presented. At month 2, 88% of patients had gained a mean ± SD of 6.0 ± 4.36 lb (p < 0.05) and 12% had lost a mean of 1.7 ± 2.15 lb (not statistically significant [NS]). At month 4, 84% had gained a mean of 6.0 ± 5.83 lb (p < 0.05) and 16% had lost a mean of 1.8 ± 1.74 lb (NS). Month 4 bioelectric impedance analysis showed the weight gained to be primarily lean tissue, with a mean increase in body cell mass of 3 ± 2.6 lb (p < 0.05) and a mean increase in fat of 1.2 ± 4.6 lb (NS). CONCLUSIONS Oxandrolone is an effective adjunct to facilitate weight restoration in patients with COPD-associated weight loss. Weight gain is primarily lean body mass. Oxandrolone was relatively well tolerated and, therefore, should be a consideration in the comprehensive management of patients with COPD and weight loss.

The effects of adding L-carnitine to a whole-body and respiratory training program were determined in moderate-to-severe chronic obstructive pulmonary disease (COPD) patients. Sixteen COPD patients (66 ± 7 years) were randomly assigned to an L-carnitine group (CG) or placebo group (PG) that received either L-carnitine or saline solution (2 g/day, orally) for 6 weeks (forced expiratory volume in the first second was 38 ± 16 and 36 ± 12%, respectively).
Both groups participated in three weekly 30-min treadmill and threshold inspiratory muscle training sessions, with 3 sets of 10 loaded inspirations (40%) at maximal inspiratory pressure. Nutritional status, exercise tolerance on a treadmill and six-minute walking test, blood lactate, heart rate, blood pressure, and respiratory muscle strength were determined at baseline and on day 42. Maximal capacity in the incremental exercise test was significantly improved in both groups (P < 0.05). Blood lactate, blood pressure, oxygen saturation, and heart rate at identical exercise levels were lower in CG after training (P < 0.05). Inspiratory muscle strength and walking test tolerance were significantly improved in both groups, but the gains of CG were significantly higher than those of PG (40 ± 14 vs 14 ± 5 cmH2O, and 87 ± 30 vs 34 ± 29 m, respectively; P < 0.05). Blood lactate concentration was significantly lower in CG than in PG (1.6 ± 0.7 vs 2.3 ± 0.7 mM, P < 0.05). The present data suggest that carnitine can improve exercise tolerance and inspiratory muscle strength in COPD patients, as well as reduce lactate production.

The impact of oral nutritional supplementation during an acute exacerbation of COPD on functional status was assessed by measuring change in lung function, strength testing, and general well-being. Subjects hospitalized for an acute exacerbation of COPD (n = 33) were randomized to extra nutritional support or regular hospital care. They consumed an additional 10 kcal/kg/d. Outcome measures were assessed at 2 wk as change scores. Forced vital capacity (% predicted) improved in the treatment group as compared with the control group (+8.7% versus -3.5%, p = 0.015), and the change in FEV1 was in the same direction but not significantly different (p = 0.099).
There were no changes in handgrip strength or respiratory muscle strength, but there was a trend towards more improvement in the general well-being score (+11.96 versus -10.25, p = 0.066). Almost all subjects were in negative nitrogen balance, indicating muscle wasting. The degree of muscle wasting was strongly correlated with the dose of corticosteroids (r = 0.73, p < 0.005). In conclusion, it is difficult to prevent important muscle wasting in patients with COPD treated with corticosteroids, but some small gains were observed with increased dietary intake.

Nutritional depletion commonly occurs in patients with COPD, causing muscle wasting and impaired physiologic function. Two hundred seventeen patients with COPD participated in a placebo-controlled, randomized trial investigating the physiologic effects of nutritional intervention alone (N) for 8 wk or combined with the anabolic steroid nandrolone decanoate (N + A). Nandrolone decanoate or placebo (P) was injected intramuscularly (women, 25 mg; men, 50 mg) in a double-blind fashion on days 1, 15, 29, and 43. Nutritional intervention consisted of a daily high-caloric supplement (420 kcal; 200 ml). Also, all patients participated in an exercise program. In the depleted patients, both treatment regimens induced a similar significant body weight gain (2.6 kg) but different body-compositional changes. Particularly in the last 4 wk of treatment, weight gain in the N group was predominantly due to an expansion of fat mass (p < 0.03 versus P and N + A), whereas the relative changes in fat-free mass (FFM) and other measures of muscle mass were more favorable in the N + A group (p < 0.03 versus P). Maximal inspiratory mouth pressure improved within both treatment groups in the first 4 wk of treatment, but after 8 wk only N + A was significantly different from P (p < 0.03).
Nutritional supplementation in combination with a short course of anabolic steroids may enhance the gain in FFM and respiratory muscle function in depleted patients with COPD without causing adverse side effects.

Background: Pulmonary rehabilitation is effective in improving exercise performance and health status in chronic obstructive pulmonary disease (COPD). However, the role of nutritional support in enhancing the benefits of exercise training has not been explored. A double-blind, randomised, controlled trial of carbohydrate supplementation was undertaken in patients attending outpatient pulmonary rehabilitation. Methods: 85 patients with COPD were randomised to receive a 570 kcal carbohydrate-rich supplement or a non-nutritive placebo daily for the duration of a 7-week outpatient pulmonary rehabilitation programme. Primary outcome measures were peak and submaximal exercise performance using the shuttle walk tests. Changes in health status, body composition, muscle strength, and dietary macronutrient intake were also measured. Results: Patients in both the supplement and placebo groups increased shuttle walking performance and health status significantly. There was no statistically significant difference between treatment groups in these outcomes. Patients receiving placebo lost weight, whereas supplemented patients gained weight. In well-nourished patients (BMI > 19 kg/m2), improvement in incremental shuttle performance was significantly greater in the supplemented group (mean difference between groups: 27 (95% CI 1 to 53) m, p < 0.05). Increases in incremental shuttle performance correlated with increases in total carbohydrate intake. Conclusions: When universally prescribed, carbohydrate supplementation does not enhance the rehabilitation of patients with COPD.
This study suggests that exercise training results in a negative energy balance that can be overcome by supplementation and that, in selected patients, this may improve the outcome of training. The finding of benefit in well-nourished patients may suggest a role for nutritional supplementation beyond the treatment of weight loss in COPD.

Unexplained weight loss is common in chronic obstructive pulmonary disease (COPD). Blood levels of tumor necrosis factor-alpha (TNF-alpha), a cytokine causing cachexia in laboratory animals, are elevated in various human diseases associated with weight loss. We therefore prospectively measured TNF-alpha serum levels (immunoradiometric assay) in patients with clinically stable COPD (n = 30; all male; mean age, 65 yr) whose weight was less (Group I; n = 16) or more (Group II; n = 14) than the lower limit of normal taken from Metropolitan Life Insurance Company tables. The patients had no cause known to elevate TNF-alpha serum levels; notably, they were not infected. Group I patients had unintentionally lost weight during the previous year, whereas the weight of Group II patients had not changed during the same period. The two groups had similar chronic airflow obstruction and arterial blood gas impairment; hyperinflation and reduction in diffusing capacity were more pronounced in Group I, but the differences were not significant. TNF-alpha serum levels (pg/ml; mean [SD]) were significantly higher in Group I than in Group II (70.2 [100.0] versus 6.7 [6.4]; p < 0.001). Group II TNF-alpha serum levels did not differ significantly from those of healthy subjects (7.8 [3.9]), whereas those of Group I were significantly higher (p < 0.001).
Because renal function was in the normal range, we conclude that increased TNF-alpha production, and not decreased TNF-alpha clearance, is a likely cause of weight loss in patients with COPD.

The purpose of this study was to determine whether a single large liquid carbohydrate (CHO) load (920 calories) affects walking performance in patients with chronic air-flow obstruction (CAO). Walking performance was measured using the 12-min walking test. Fifteen patients with stable CAO (FEV1, 1.30 ± 0.41 L; FVC, 3.26 ± 0.46 L) underwent 12-min walking tests 40 min after ingestion of either CHO or placebo on consecutive days in randomized double-blind fashion. Three practice walks were performed on a preliminary day in order to eliminate learning effects. Resting measurements of ventilation (VE) and carbon dioxide output (VCO2) were obtained prior to each walking test. Carbohydrate significantly increased both VCO2 (from 0.288 ± 0.060 to 0.372 ± 0.057 L/min, p < 0.001) and VE (from 15.2 ± 3.5 to 18.5 ± 3.1 L/min, p < 0.001) at rest. The total 12-minute walking distance decreased from 894 ± 199 to 847 ± 191 m following CHO (p < 0.005). This distance decreased in 14 of the 15 study patients. The decrease in walking distance ranged from 1.5 to 168 m (0.2 to 15.2%). From this study we conclude that a large liquid carbohydrate load adversely affects walking performance in patients with CAO. This potential impairment of functional capacity should be considered when caloric intake is increased in attempts to improve nutritional status in this patient population.

A body weight lower than 90% of the optimal value has an unfavorable influence on the prognosis of chronic obstructive pulmonary disease (COPD).
Short-term studies of up to three months' duration have shown improved respiratory muscle function, exercise tolerance and immunologic parameters with an increased caloric intake of 45 kcal/kg body weight. In a randomized trial of twelve months, 14 of 30 patients with an average FEV1 of 0.8 l were instructed to take a high-calorie diet. For simplicity, part of the calories were administered as Fresubin, a fluid nutrient formula. Although a weight gain of 7 kg (p = 0.003) was obtained, the difference from the control group was not statistically significant (p = 0.08). The same was true for skin-fold thickness (12.4 vs 5.7 mm), change in ventilatory parameters and the 6-minute walking distance (-33 vs -86 m). Subjective improvement was, however, impressive in all patients with dietary intervention, probably explainable by increased attention. Dietary counselling for increased intake of calories, vitamins and also calcium is thus very important in the treatment of patients with COPD.

BACKGROUND Chronic obstructive pulmonary disease (COPD) is characterized by an incompletely reversible limitation in airflow. A physiological variable, the forced expiratory volume in one second (FEV1), is often used to grade the severity of COPD. However, patients with COPD have systemic manifestations that are not reflected by the FEV1. We hypothesized that a multidimensional grading system that assessed the respiratory and systemic expressions of COPD would better categorize and predict outcome in these patients. METHODS We first evaluated 207 patients and found that four factors predicted the risk of death in this cohort: the body-mass index (B), the degree of airflow obstruction (O) and dyspnea (D), and exercise capacity (E), measured by the six-minute-walk test. We used these variables to construct the BODE index, a multidimensional 10-point scale in which higher scores indicate a higher risk of death.
We then prospectively validated the index in a cohort of 625 patients, with death from any cause and from respiratory causes as the outcome variables. RESULTS There were 25 deaths among the first 207 patients and 162 deaths (26 percent) in the validation cohort. Sixty-one percent of the deaths in the validation cohort were due to respiratory insufficiency, 14 percent to myocardial infarction, 12 percent to lung cancer, and 13 percent to other causes. Patients with higher BODE scores were at higher risk for death; the hazard ratio for death from any cause per one-point increase in the BODE score was 1.34 (95 percent confidence interval, 1.26 to 1.42; P<0.001), and the hazard ratio for death from respiratory causes was 1.62 (95 percent confidence interval, 1.48 to 1.77; P<0.001). The C statistic for the ability of the BODE index to predict the risk of death was larger than that for the FEV1 (0.74 vs. 0.65). CONCLUSIONS The BODE index, a simple multidimensional grading system, is better than the FEV1 at predicting the risk of death from any cause and from respiratory causes among patients with COPD.

Dysfunction of the muscles of ambulation contributes to exercise intolerance in chronic obstructive pulmonary disease (COPD). Men with COPD have a high prevalence of low testosterone levels, which may contribute to muscle weakness. We determined the effects of testosterone supplementation (100 mg of testosterone enanthate injected weekly), with or without resistance training (45 minutes three times weekly), on body composition and muscle function in 47 men with COPD (mean FEV1 = 40% predicted) and low testosterone levels (mean = 320 ng/dl). Subjects were randomized to 10 weeks of placebo injections + no training, testosterone injections + no training, placebo injections + resistance training, or testosterone injections + resistance training.
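A per-point hazard ratio such as the BODE study's 1.34 compounds multiplicatively under the proportional-hazards model used in that analysis, so the relative hazard between two score levels is the per-point ratio raised to the score difference. The helper below is a hypothetical sketch of that arithmetic, not code from the study.

```python
def relative_hazard(hr_per_point: float, delta_points: float) -> float:
    """Relative hazard implied by a per-point hazard ratio under a
    proportional-hazards model: HR raised to the score difference."""
    return hr_per_point ** delta_points
```

On the reported numbers, a patient scoring 3 BODE points higher carries roughly 1.34**3, i.e. about 2.4 times the all-cause hazard.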
Testosterone injections yielded a mean increase of 271 ng/dl in the nadir serum testosterone concentration (to the middle of the normal range for young men). The increase in lean body mass (by dual-energy X-ray absorptiometry) averaged 2.3 kg with testosterone alone and 3.3 kg with combined testosterone and resistance training (p < 0.001). The increase in one-repetition maximum leg-press strength averaged 17.2% with testosterone alone, 17.4% with resistance training alone, and 26.8% with testosterone + resistance training (p < 0.001). Interventions were well tolerated, with no abnormalities in safety measures. Further studies are required to determine the long-term benefits of adding testosterone supplementation and resistance training to rehabilitative programs for carefully screened men with COPD and low testosterone levels.

Although increasing attention has been paid to nutritional aspects of chronic obstructive pulmonary disease (COPD), limited information is available regarding the prevalence and consequences of nutritional depletion in a random outpatient COPD population. We studied body composition in relation to respiratory and peripheral skeletal muscle function in 72 COPD patients (mean (SD) forced expiratory volume in one second (FEV1) 53 (15)% predicted) who came to the lung function laboratory for routine lung function measurements. Patients were characterized by the degree of body weight loss and fat-free mass depletion. According to this definition, 14% of the group suffered from both loss of body weight and depletion of fat-free mass, whereas 7% had one of these conditions. We found that tissue depletion was concomitant with lower values for respiratory and peripheral skeletal muscle strength (46.0 (27.2) vs 77.1 (29.8) kg) and a significantly lower transfer coefficient for carbon monoxide (KCO 64.9 (16.2) vs 81.9 (24.5)% pred).
Stratification by KCO (< 60% vs > 80%) also revealed significantly lower values for fat-free mass and higher values for intrathoracic gas volumes, total lung capacity (TLC) and residual volume (RV) in the group with a KCO < 60% pred. Analysis of covariance, taking fat-free mass as a covariate, indicated an independent contribution of KCO to maximal inspiratory mouth pressure (PImax) but not to peripheral skeletal muscle strength. It is concluded that a substantial number of COPD outpatients suffer from nutritional depletion, preferentially affecting peripheral skeletal muscle function.

The purpose of this study was to compare the effects of isocaloric liquid meals with high fat (55 percent) and low carbohydrate (28 percent) content (Pulmocare) to meals with low fat (30 percent) and high carbohydrate (53 percent) content (EnsurePlus) on exercise performance in subjects with chronic airflow obstruction (CAO). Twelve stable subjects with CAO (FEV1 = 1.30 ± 0.47 L) underwent incremental symptom-limited exercise tests 90 minutes following the ingestion of 920 calories of EnsurePlus HN (E), 920 calories of Pulmocare (P), or a noncaloric placebo (C). Tests were performed on three days, in a double-blind randomized fashion. Expired gases were collected continuously and analyzed every 30 seconds. The mean maximal work load after E (81 ± 24 W) was significantly less than that after P (88 ± 21 W) or C (88 ± 24 W). The mean ventilation at exhaustion was similar after E (48 ± 13 L/min), P (51 ± 11 L/min), and C (49 ± 10 L/min). In comparison with C, six of the 12 individuals had a decreased work load following E, while only one had a decreased maximal tolerated work load following P.
The results of this study suggest that meals with a higher fat and lower carbohydrate content may be less likely to impair work performance of patients with CAO in the absorptive phase than meals with a lower fat and higher carbohydrate content. These findings may have clinical significance to patients with CAO who complain of postprandial exertional dyspnea. BACKGROUND & AIMS Previous studies reported a severely impaired energy balance in COPD patients during the first days of an acute exacerbation, mainly due to a decreased energy and protein intake. The aim of the study was to investigate the feasibility and effectiveness of energy- and protein-rich nutritional supplements during hospitalization for an acute exacerbation in nutritionally depleted COPD patients. METHODS In a randomized double-blind, placebo-controlled two-center trial, 56 COPD patients were randomized and 47 patients completed the study. Nutritional intervention consisted of 3 x 125 ml (2.38 MJ/day) and the placebo group received similar amounts of a non-caloric fluid. Medical therapy and dietetic consultation were standardized and dietary intake was measured daily. Body composition, respiratory and skeletal muscle strength, lung function and symptoms were measured on admission and on days 4 and 8 of hospitalization. RESULTS Forty-seven percent of the patients had experienced recent involuntary weight loss prior to admission. The degree of weight loss was inversely related to resting arterial oxygen tension (r = 0.31; P < 0.05). Nutritional intervention resulted in a significant increase in energy (16% vs. placebo) and protein intake (38% vs. placebo). Mean duration of hospitalization was 9 +/- 2 days. Relative to usual care, no additional improvements in lung function or muscle strength were seen after nutritional intervention.
CONCLUSIONS Oral nutritional supplementation during hospitalization for an acute exacerbation is feasible in nutritionally depleted COPD patients and does not interfere with normal dietary intake. BACKGROUND: High calorie intakes, especially as carbohydrate, increase carbon dioxide production (VCO2) and may precipitate respiratory failure in patients with severe pulmonary disease. Energy obtained from fat results in less carbon dioxide and thus may permit a reduced level of alveolar ventilation for any given arterial blood carbon dioxide tension (PaCO2). METHODS: Ten patients with stable severe chronic obstructive lung disease underwent a six minute walk before and 45 minutes after taking 920 kcal of a fat rich drink, an isocalorific amount of a carbohydrate rich drink, and an equal volume of a non-calorific control liquid on three separate days, in a double blind randomised crossover study. Borg scores of the perceived effort to breathe were measured at the beginning and end of each six minute walk. Minute ventilation (VE), VCO2, oxygen consumption (VO2), respiratory quotient (RQ), arterial blood gas tensions, and lung function were measured before and 30 minutes after each test drink. RESULTS: Baseline measurements were similar on all three test days and the non-calorific control drink resulted in no changes in any of the measured variables. The carbohydrate rich drink resulted in significantly greater increases in VE, VCO2, VO2, RQ, PaCO2, and Borg score and a greater fall in the distance walked in six minutes than the fat rich drink (mean fall after carbohydrate rich drink 17 m v 3 m after fat rich drink and the non-calorific control). The increase in VCO2 correlated significantly with the decrease in six minute walking distance and the increase in Borg score after the carbohydrate rich drink. The only significant change after the fat rich drink when compared with the non-calorific control was an increase in VCO2.
CONCLUSIONS: Comparatively small changes in the carbohydrate and fat constitution of meals can have a significant effect on VCO2, exercise tolerance, and breathlessness in patients with chronic obstructive lung disease. Background: Skeletal muscle wasting and dysfunction are strong independent predictors of mortality in patients with chronic obstructive pulmonary disease (COPD). Creatine nutritional supplementation produces increased muscle mass and exercise performance in health. A controlled study was performed to look for similar effects in 38 patients with COPD. Methods: Thirty-eight patients with COPD (mean (SD) forced expiratory volume in 1 second (FEV1) 46 (15)% predicted) were randomised to receive placebo (glucose polymer 40.7 g) or creatine (creatine monohydrate 5.7 g, glucose 35 g) supplements in a double blind trial. After 2 weeks loading (one dose three times daily), patients participated in an outpatient pulmonary rehabilitation programme combined with maintenance (once daily) supplementation. Pulmonary function, body composition, and exercise performance (peripheral muscle strength and endurance, shuttle walking, cycle ergometry) were assessed at baseline (n = 38), post loading (n = 36), and post rehabilitation (n = 25). Results: No difference was found in whole body exercise performance between the groups: for example, incremental shuttle walk distance mean −23.1 m (95% CI −71.7 to 25.5) post loading and −21.5 m (95% CI −90.6 to 47.7) post rehabilitation. Creatine increased fat-free mass by 1.09 kg (95% CI 0.43 to 1.74) post loading and 1.62 kg (95% CI 0.47 to 2.77) post rehabilitation. Peripheral muscle performance improved: knee extensor strength 4.2 N.m (95% CI 1.4 to 7.1) and endurance 411.1 J (95% CI 129.9 to 692.4) post loading, knee extensor strength 7.3 N.m (95% CI 0.69 to 13.92) and endurance 854.3 J (95% CI 131.3 to 1577.4) post rehabilitation.
Creatine improved health status between baseline and post rehabilitation (St George's Respiratory Questionnaire total score −7.7 (95% CI −14.9 to −0.5)). Conclusions: Creatine supplementation led to increases in fat-free mass, peripheral muscle strength and endurance, and health status, but not exercise capacity. Creatine may constitute a new ergogenic treatment in COPD. RATIONALE Nutritional depletion is a common problem in chronic obstructive pulmonary disease (COPD) patients. It is caused, to a large extent, by an imbalance between low energy intake and high energy requirements. This problem adversely affects morbidity and mortality. However, the use of nutritional supplements to meet these energy needs requires optimisation between positive and adverse effects on outcome before being used systematically as part of comprehensive care. PURPOSE The aim of our study was to investigate the effects of oral nutritional repletion on quality of life in stable COPD patients. METHODS Prospective, randomised and multi-centre study. Stable COPD patients with a body mass index 22, a fat-free mass index 16, and/or a recent involuntary weight loss (5% during the last month, or 10% during the last 3 months) were studied. Exclusion criteria were signs of an airway infection; a cardiovascular, neurological, or endocrine disease; treatment with oral steroids, immunosuppressors or oxygen therapy at home; and receipt of nutritional supplements. During 12 weeks, patients were encouraged to ingest a total daily defined energy intake. Randomly, in patients from group A the total daily energy load was Resting Energy Expenditure (REE) x 1.7, and in those from group B, REE x 1.3. Total daily energy intake was achieved with regular food plus, if necessary, an oral nutritional supplement rich in proteins (with 50% of whey protein), with predominance of carbohydrates over fat, and enriched in antioxidants.
The primary end-point variable was quality of life. Secondary end-point outcomes included body weight, body composition, lung function, handgrip strength, and compliance with the energy intake previously planned. Data were treated with the SAS System. Student's test, Wilcoxon's rank sum test, and Mann-Whitney's test were used. RESULTS At baseline both groups of patients were comparable. All patients needed oral nutritional supplements to achieve the total daily defined energy intake. After 12 weeks of follow-up, patients in both groups significantly increased energy intake. Patients in group A increased body weight (P=0.001), triceps skin fold thickness (P=0.009) and body fat mass (P=0.02), and decreased body fat-free mass index (P=0.02). In this group a marked increase in airflow limitation was observed. A tendency to increase body weight and handgrip strength, and to decrease airflow limitation, was observed in patients from group B. Furthermore, patients in the latter group showed a significant improvement in the feeling of control over the disease (P=0.007) and a tendency to improve on the other criteria of a quality of life scale. CONCLUSIONS According to our results, a total daily energy intake of REE x 1.3 is preferable to REE x 1.7 in mild stable COPD patients. The administration of oral nutritional supplements rich in proteins (with 50% of whey protein), with predominance of carbohydrates over fat, and enriched in antioxidants, to achieve the total daily defined energy intake in patients in group B was followed by a significant improvement in one criterion (mastery) among many others in a quality of life scale. UNLABELLED Dietary intervention studies in COPD patients are often short-term inpatient studies where a certain amount of extra energy is guaranteed. The aim of this study was to evaluate the effect of a 1-year individual multifaceted dietary intervention during multidisciplinary rehabilitation.
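The energy prescriptions in the trial above are fixed multiples of resting energy expenditure (REE x 1.3 vs REE x 1.7), and a later study in this review reports REE against the Harris-Benedict prediction. As a minimal illustrative sketch of that arithmetic, using the standard Harris-Benedict (1919) coefficients and an invented patient (the anthropometric values are assumptions, not taken from any study here):

```python
def harris_benedict_ree(sex: str, weight_kg: float, height_cm: float, age_yr: float) -> float:
    """Resting energy expenditure (kcal/day), original Harris-Benedict coefficients."""
    if sex == "male":
        return 66.5 + 13.75 * weight_kg + 5.003 * height_cm - 6.775 * age_yr
    return 655.1 + 9.563 * weight_kg + 1.850 * height_cm - 4.676 * age_yr

# Hypothetical depleted COPD patient (illustrative values only)
ree = harris_benedict_ree("male", weight_kg=55, height_cm=170, age_yr=65)

# The two energy prescriptions compared in the trial above
target_a = 1.7 * ree  # group A
target_b = 1.3 * ree  # group B

print(f"REE ~ {ree:.0f} kcal/day; group A target ~ {target_a:.0f}; group B target ~ {target_b:.0f}")
```

For this hypothetical patient the two prescriptions differ by roughly 500 kcal/day, which is why the trial could detect different effects on fat mass and airflow limitation between the groups.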
Eighty-seven patients with severe COPD not requiring oxygen therapy were included; 24 of them served as controls. A dietary history interview was performed at baseline and at study end. The dietary advice given was based on the results of the dietary history and socio-economic status. The intervention group was divided into three parts; NW: normal weight (dietary advice aiming at weight maintenance), OW: overweight (weight-reducing advice) and UW: underweight (dietary advice based on an energy- and protein-rich diet). RESULTS UW group: eighty-one per cent of the patients gained weight or kept a stable weight. OW group: fifty-seven per cent lost more than 2 kg. NW group: seventy-six per cent kept a stable weight or gained weight. Increased dietary intake from baseline was seen for energy, protein, carbohydrates and certain micronutrients (P < 0.05) in the UW group. Six-minute walking distance increased by approximately 20 m in both NW (P < 0.05) and UW patients. To conclude, slight but uniform indications of positive effects of dietary intervention during multidisciplinary rehabilitation were seen. Dietary intervention in underweight COPD patients might be a prerequisite for physical training. The prevalence and features of malnutrition in COPD patients have been studied extensively in stable conditions but are poorly defined in the presence of acute respiratory failure (ARF). Nutritional status was prospectively assessed, on hospital admission, in 50 consecutive COPD patients presenting with ARF, 27 of them requiring mechanical ventilation (MV). Malnutrition, defined on a multiparameter nutritional index, was observed in 60 percent (30/50) of all patients, and in 39 percent (13/33) of those whose body weight was equal to or above 90 percent ideal body weight (IBW). Malnutrition was more frequent in those patients who required MV than in those who did not (74 percent vs 43 percent, p < 0.05).
Subcutaneous fat stores were decreased (triceps skinfold thickness [TSF] < 80 percent pred) in 68 percent of patients, and markedly depleted (TSF < 60 percent pred) in 52 percent of them. The indices of lean body mass, ie, mid-arm muscle circumference (MAMC) and creatinine height index (CHI), were decreased in, respectively, 42 percent and 71 percent of patients, but MAMC was severely depressed (< 60 percent pred) in only 6 percent of them. A severe decrease of prealbumin (< 100 mg/L), retinol-binding protein (< 20 mg/L), and albumin (< 20 g/L) serum concentrations was observed in, respectively, 22 percent, 28 percent, and 4 percent of patients. These results suggest that an assessment of nutritional status using a multiparameter approach should be systematically performed in COPD patients with ARF, especially in those requiring MV, as malnutrition may have deleterious effects on weaning off MV. In a randomized, double-blinded study, patients with chronic obstructive pulmonary disease and hypercapnia were fed low, moderate, and high carbohydrate diets to determine the effect on metabolic and ventilatory values. The low carbohydrate diet consisted of 28% carbohydrate calories and 55% fat calories and resulted in significantly lower production of CO2 (p less than 0.002), respiratory quotient (p less than 0.001), and arterial PCO2 (p less than 0.05). At the end of the 15-day study, both the forced vital capacity (p less than 0.05) and the forced expiratory volume in 1 second (p less than 0.05) had improved by 22% over baseline values. Total calories given surpassed daily caloric requirements.
This approach, together with a low carbohydrate, high fat mixture, may be beneficial for such patients. STUDY OBJECTIVE To evaluate the influence of oral anabolic steroids on body mass index (BMI), lean body mass, anthropometric measures, respiratory muscle strength, and functional exercise capacity among subjects with COPD. DESIGN Prospective, randomized, controlled, double-blind study. SETTING Pulmonary rehabilitation program. PARTICIPANTS Twenty-three undernourished male COPD patients in whom BMI was below 20 kg/m2 and the maximal inspiratory pressure (PImax) was below 60% of the predicted value. INTERVENTION The study group received 250 mg of testosterone i.m. at baseline and 12 mg of oral stanozolol a day for 27 weeks, during which time the control group received placebo. Both groups participated in inspiratory muscle exercises during weeks 9 to 27 and cycle ergometer exercises during weeks 18 to 27. MEASUREMENTS AND RESULTS Seventeen of 23 subjects completed the study. Weight increased in nine of 10 subjects who received anabolic steroids (mean, +1.8+/-0.5 kg; p<0.05), whereas the control group lost weight (-0.4+/-0.2 kg). The study group's increase in BMI differed significantly from that of the control group from weeks 3 to 27 (p<0.05). Lean body mass increased in the study group at weeks 9 and 18 (p<0.05). Arm muscle circumference and thigh circumference also differed between groups (p<0.05). Changes in PImax (study group, 41%; control group, 20%) were not statistically significant. No changes in the 6-min walk distance or in maximal exercise capacity were identified in either group. CONCLUSION The administration of oral anabolic steroids for 27 weeks to malnourished male subjects with COPD was free of clinical or biochemical side effects.
It was associated with increases in BMI, lean body mass, and anthropometric measures of arm and thigh circumference, with no significant changes in endurance exercise capacity. The association between severe nutritional depletion and chronic obstructive pulmonary disease (COPD) has long been recognized. A potential therapeutic benefit of nutritional support was previously suggested by us in a pilot investigation. Subsequent studies have reported conflicting results regarding the role of nutritional therapy in this clinical population. We report a randomized controlled study of nutritional therapy in underweight patients with COPD that combines an initial inpatient investigation (controlled nutritional support) with a prolonged outpatient follow-up interval. Provision of adequate calorie and protein support, adjusted to metabolic requirements, resulted in weight gain (intervention = +2.4 kg versus control -0.5 kg), improved handgrip strength (intervention = +5.5 kg-force versus control -6.0 kg-force), expiratory muscle strength (intervention = +14.9 cm H2O versus control -9.2 cm H2O), and walking distance (intervention = +429 feet versus control -1.0 foot). Inspiratory muscle strength was also improved (intervention = +11.4 cm H2O versus control +4.8 cm H2O), although this did not quite reach statistical significance. We conclude that provision of adequate nutrient supply under controlled conditions results in significant clinical improvements in the COPD patient population. However, the intervention is costly, time-intensive, and of limited therapeutic magnitude. More detailed work on alternative outpatient strategies combined with additional rehabilitative measures is indicated to delineate the full therapeutic potential of nutritional support for this clinical population. Background: Undernutrition in hospitalized patients is often not recognized and nutritional support neglected.
Chronic obstructive pulmonary disease is frequently characterized by weight loss. No data exist on the effects of nutritional supplementation in underweight lung transplantation candidates during hospitalization. Objective: To evaluate the effects on energy intake and body weight of an intensified nutritional support compared to the regular support during hospitalization. Methods: The participants were underweight (n = 42) and normal-weight (n = 29) patients with end-stage pulmonary disease assessed for lung transplantation. The underweight patients were randomized to receive either an energy-rich diet planned for 10 MJ/day and 45–50 energy percentage fat and offered supplements (group 1), or the normal hospital diet planned for 8.5–9 MJ/day and 30–35 energy percentage fat and regular support (group 2, control group). The normal-weight control patients (group 3) received the normal diet. Food intake was recorded for 3 days. Results: During a mean hospital stay of 12 days, the energy intake was significantly greater for the patients on intensified nutritional support (median 11.2 MJ) than for the underweight patients on the regular support (8.4 MJ; p < 0.02) and the normal-weight patients (7.0 MJ; p < 0.001). The increase in energy intake in group 1 resulted in a significant weight gain (median 1.2 kg) compared with group 2 (p < 0.01) and group 3 (p < 0.001). Conclusions: In a group of underweight patients with lung disease assessed for lung transplantation, it was possible to increase energy intake by an intensified nutritional support, which was associated with a significant weight gain compared to the regular nutritional support during a short hospital stay. STATE OF THE ART The IRAD2 trial is evaluating a 3-month home intervention which includes education, oral supplements, exercise and androgenic steroids in undernourished patients with chronic respiratory failure.
The main objective is to increase the six-minute walking distance by more than 50 m with an improvement in health-related quality-of-life. Secondary end-points include a reduction in exacerbation rates by 25%, a reduction in health-related costs and an increase in survival during the year following intervention. MATERIAL AND METHODS This interventional, multi-centre, prospective, two-armed parallel, controlled trial is being conducted in 200 patients. In both groups, "Control" and "Rehabilitation", 7 home visits are scheduled during the 3-month intervention for education purposes. In the "Rehabilitation" group, patients will receive 160 mg/d of oral testosterone undecanoate in men, 80 mg/d in women, oral dietary supplements (563 kcal/d) and exercises on an ergometric bicycle 3 to 5 times a week. EXPECTED RESULTS In the event of significant responses to intervention, this trial would validate a comprehensive and global home-care approach for undernourished patients with chronic respiratory failure combining therapeutic education, oral supplements, androgenic substitution and physical activity. Patients with chronic obstructive pulmonary disease (COPD) often suffer from weight loss. The aim of the present study was to gain insight into the energy balance of depleted ambulatory COPD patients, in relation to their habitual level of physical activity and consumption of oral nutritional supplements. Clinically stable and weight-stable patients (n 20; BMI 19.8 +/- SD 2.0 kg/m2) were studied 1 and 3 months after rehabilitation or recovery in the clinic and were randomly assigned to a control or intervention group with regard to nutritional supplementation. Energy intake was measured with a 7 d food record. Energy expenditure was estimated from a simultaneous 7 d assessment of physical activity with a tri-axial accelerometer for movement registration in combination with measured BMR. Body mass was measured at several time points.
The body mass remained stable in both groups after 1 or 3 months and mean energy balances were comparable for both groups. The mean body-mass change between months 1 and 3 was negatively related to the mean physical activity level (r -0.49; P=0.03). Weight change over the 3 months was negatively associated with the physical activity level. These results suggest that knowledge of the individual physical activity level is necessary for estimation of the energy needs of the COPD patient. We examined the effect of nutritional supplementation for 13 wk on anthropometric, pulmonary function, and immunological status in malnourished ambulant patients with pulmonary emphysema (EP). The study was placebo controlled, randomized and double blind. Twenty-eight patients were included. Thirteen patients in the fed group were provided with a nutritional formula providing 20% protein, 30% fat, and 50% carbohydrate, 1 kcal/ml, 400 ml/day. The control group was provided with a reference product of the same consistency and taste containing 0.1 kcal/ml, 400 ml/day for 13 wk. The fed group had a mean weight gain of 1.5 kg during the study period, while the control group concomitantly gained 0.16 kg, the difference being significant (p less than 0.01). The sum of four skinfolds increased 2.7 mm in the fed group and decreased 0.9 mm in the control group, the difference being significant (p less than 0.01). No differences were observed regarding pulmonary function or immunological status. We also found a high habitual energy intake in our study group (204% of basal energy expenditure). We conclude that nutritional supplementation produces weight gain in malnourished patients with EP, but it does not change other indices of well-being. Twenty-five patients with advanced chronic obstructive pulmonary disease (COPD) having malnutrition were randomly assigned to a parenteral nutrition (PN) group (12 cases) and a control group (13 cases).
Patients in the PN group received 10% Intralipid and 5% Nutrisol-S. Body weight, blood lymphocytes, serum albumin, transferrin, prealbumin, fibronectin and serum free fatty acids (FFA) and serum free amino acids (AA) were monitored. The results showed that parenteral nutritional support for 10-20 days improved the nutritional status, and significantly increased the body weight and the serum albumin, prealbumin, fibronectin, transferrin (P < 0.05 or P < 0.01) and some kinds of serum free amino acids; PaCO2 also decreased. Serum FFA remained under the normal level whether or not nutritional support was given (P < 0.01). It is suggested that nutritional support may promote synthesis of protein in COPD patients with malnutrition. Intralipid infusion would preserve nitrogen and diminish carbohydrate metabolism and might be beneficial in correcting the hypercapnia in COPD. STUDY OBJECTIVES To assess the effect of megestrol acetate (MA), a progestational appetite stimulant commonly used in patients with AIDS and cancer, on body weight and composition, respiratory muscle strength, arterial blood gas levels, and subjective perceptions in COPD patients. DESIGN AND SETTING Prospective, double-blind, randomized, placebo-controlled trial conducted on an outpatient basis at 18 sites. PATIENTS Underweight (< 95% ideal body weight) COPD patients > or = 40 years old. INTERVENTIONS Either MA, 800 mg/d oral suspension, or placebo at a 1:1 ratio for 8 weeks. RESULTS Of 145 randomized patients (63% men), 128 patients completed the trial. Body weight increased by 3.2 kg in the MA group and 0.7 kg in the placebo group (p < 0.001). Anthropometric and dual-energy radiograph absorptiometry assessments confirmed that weight gain was mainly fat.
Spirometry and maximal voluntary ventilation showed no significant changes from baseline in either group, and the difference in the change in maximum inspiratory pressure between groups was not significant. The 6-min walk distances did not differ statistically between groups at week 2 and week 4, but were greater in the placebo group at week 8 (p = 0.012). Consistent with the known ability of MA to stimulate ventilation, PaCO2 decreased (4.6 mm Hg, p < 0.001) and PaO2 increased (2.8 mm Hg, p < 0.04) in the MA group. Questionnaires revealed that body image and appetite improved in the MA group but not the placebo group. Adverse event frequency and type were similar in both groups, but cortisol and testosterone (in men) levels decreased substantially in the MA group. CONCLUSIONS We conclude that MA safely increased appetite and body weight, stimulated ventilation, and improved body image in underweight COPD patients, but did not improve respiratory muscle function or exercise tolerance. We carried out a prospective randomized controlled trial to investigate the effects of short-term refeeding (16 days) in 10 malnourished inpatients with chronic obstructive pulmonary disease (COPD). Six patients were randomized to receive sufficient nasoenterically administered calories to provide a total caloric intake equal to 1,000 kcal above their usual intake. The other four patients were sham fed, receiving only 100 kcal more. Measurements of nutritional status, respiratory muscle strength and endurance, adductor pollicis function, and pulmonary function were performed initially and at study end. The refed group gained significantly more weight and showed significant increases in maximal expiratory pressure and mean sustained inspiratory pressure. There were no significant changes in the maximal inspiratory pressure or in adductor pollicis function.
In malnourished inpatients with COPD, short-term refeeding leads to improvement in respiratory muscle endurance and in some parameters of respiratory muscle strength in the absence of demonstrable changes in peripheral muscle function. We have measured caloric intake, energy expenditure, and the thermogenic effect of food in ten patients with stable COPD who had a history of involuntary weight loss over several years and were malnourished (< 85 percent ideal body weight). Each patient completed a 7-day food record. Indirect calorimetry was performed in the resting postabsorptive state. After placement of a nasoenteric tube, patients were randomly assigned to be refed or sham-fed (mean +/- SD, 16 +/- 3 days), following which metabolic measurements were repeated. Indirect calorimetry was also performed before and after a large meal in each patient. Home caloric intake was 135 +/- 23 percent of resting energy expenditure. Resting energy expenditure was 94 +/- 16 percent of that predicted by the Harris-Benedict equation and did not change significantly during inpatient refeeding. Refeeding resulted in weight gain (2.4 +/- 1.9 kg, p < 0.02). A large meal caused substantial increases in energy expenditure (24 +/- 18 percent), carbon dioxide production (39 +/- 18 percent), and oxygen consumption (23 +/- 16 percent). We conclude that stable malnourished COPD patients consume adequate calories to meet average energy requirements and are not hypermetabolic.
Inpatient refeeding by nocturnal nasoenteric infusion is well tolerated and results in weight gain, but the thermogenic effect of a large meal poses a considerable metabolic and ventilatory load that could precipitate acute respiratory failure. Eight malnourished patients with emphysema (EMPH) and eight malnourished patients without evidence of lung disease (MALN) received an infusion of 5% dextrose plus electrolytes (D5W) for 48 h and were then randomly assigned to a hypercaloric diet with either 53% of the calories as carbohydrate (CB) or with 55% as fat (FB) for the first week, maintaining a constant protein intake. The alternate diet was given the following week. Ventilation and gas exchange were measured during supine cycle ergometry at 0, 12, and 25 W during the D5W, CB, and FB diet periods. At each exercise intensity, the EMPH group demonstrated a 12-15% greater O2 consumption, a lower respiratory quotient, and an O2 debt larger than that of the MALN group. Resting ventilation was higher during the CB than the FB regimen in both groups of patients, but during the CB diet the EMPH group had a more exaggerated ventilatory response than the MALN group. The results demonstrate that EMPH patients have an unusual metabolic pattern during hypercaloric feeding and exercise. Furthermore, in EMPH patients a FB regimen does not appear to create the additional stress on the respiratory system during exercise that is generated with a CB regimen. The purpose of this study was to examine the impact of nutritional support on nitrogen-energy relationships and functional parameters in malnourished patients with emphysema. Malnourished patients without lung disease served as the control group.
Ten ambulatory, stable patients with emphysema and six patients without lung disease received an infusion of 5% dextrose (baseline) plus electrolytes (D5W) for two days, which was followed by an enteral or a parenteral infusion of either a carbohydrate-based (CB, 53% carbohydrate) or a fat-based diet (FB, 55% fat) for 1 wk each, in a randomized cross-over design. All patients had greater than 10% weight loss. Caloric intake was set at 1.7 times the resting energy expenditure (REE) as measured during the baseline period. The REE of patients with emphysema was 23 and 27% above that of the control group during the baseline and refeeding periods, respectively. The increased REE was met primarily by an increased carbohydrate oxidation. During the infusion of D5W, N balance was lower in patients with emphysema, but during repletion N balance was similar in both groups of patients. Two weeks of nutritional support with either a CB or a FB diet increased body weight, N balance, and arm muscle area and improved maximal inspiratory pressure, skeletal muscle strength, and endurance-strength (using quadriceps, hamstring, and handgrip) to a similar degree in malnourished patients with and without lung disease. In other stress states, such as infection, it has been shown that hypermetabolism, hypercatabolism, and preferential fat oxidation occur concomitantly. Patients with emphysema are unusual because, although they are hypermetabolic, they are not hypercatabolic and do not demonstrate preferential fat oxidation. The purpose of this study was to investigate body composition in patients with chronic obstructive pulmonary disease, and its relation to pulmonary function. Seventeen men with pulmonary emphysema who were being treated as outpatients were divided into three groups according to ideal body weight (IBW): group A, % IBW > or = 90%; group B, 90 > % IBW > or = 80; and group C, % IBW < 80.
All underwent body composition analysis by dual energy X-ray absorptiometry. Fat mass and bone mineral content were significantly lower in groups B and C than in control subjects. Lean mass was significantly lower in group C than in control subjects. By contrast, group A did not differ significantly from control subjects. Lean mass correlated significantly with %VC, FEV1, RV/TLC, and MVV. These data suggest that lean mass is low in moderately and severely malnourished patients, that bone mineral content and fat mass are low in mildly malnourished patients, and that abnormal body composition is associated with ventilatory impairment in patients with chronic obstructive pulmonary disease. We examined the effect of nutritional supplementation for 8 wk on respiratory muscle function (RMF) in 21 malnourished patients with COPD. Patients were randomized to a fed group or to a control group. Patients in the fed group were provided with an enteral formula in addition to their usual diet. Daily calorie and protein intake and weekly anthropometric measures were made. Pulmonary function tests were measured on Weeks 1, 4, and 8. Respiratory muscle strength was measured by means of maximal inspiratory and expiratory pressures (MIP, MEP), and respiratory muscle endurance was measured by the maximal sustained ventilatory capacity (MSVC). The mean weight of the fed group increased from 52.2 +/- 6.4 to 53.3 +/- 6.9 kg (NS). The mean daily caloric intake of the fed group was significantly increased during the study (p less than 0.02). The mean calorie intake of the fed group during the study was 174 +/- 17% of the estimated basal energy expenditure. During the study period, no change was observed in anthropometric measures, pulmonary function studies, or RMF.
Because patients tend to decrease their own food intake while receiving enteral formulas, it is difficult to provide the sufficient calories and protein needed to effect changes in nutritional status and RMF in an outpatient COPD population. In addition, we compared RMF in 12 poorly nourished male patients (87.6 +/- 6.1% of ideal body weight) and 13 well-nourished male patients with severe COPD. Both groups had comparable degrees of air-flow limitation and hyperinflation. No difference was noted between the groups in either MIP, MEP, or MSVC. (ABSTRACT TRUNCATED AT 250 WORDS) Patients with chronic obstructive pulmonary disease (COPD) often develop weight loss, which is associated with increased mortality. Recombinant human growth hormone (rhGH) treatment has been proposed to improve nitrogen balance and to increase muscle strength in these patients. The aim of this study was to assess the effects of rhGH administration on the nutritional status, resting metabolism, muscle strength, exercise tolerance, dyspnea, and subjective well-being of underweight patients with stable COPD. Sixteen patients attending a pulmonary rehabilitation program (age: 66 +/- 9 yr; weight: 77 +/- 7% of ideal body weight; FEV1: 39 +/- 13% of predicted) were randomly treated daily with either 0.15 IU/kg rhGH or placebo during 3 wk in a double-blind fashion. Measurements were made at the beginning (D0) and at the end (D21) of treatment and 2 mo later (D81). Body weight was similar in the two groups during the study, but lean body mass was significantly higher in the rhGH group at D21 (p < 0.01) and D81 (p < 0.05). The increase in lean body mass was 2.3 +/- 1.6 kg in the rhGH group and 1.1 +/- 0.9 kg in the control group at D21, and 1.9 +/- 1.6 kg in the rhGH group and 0.7 +/- 2.1 kg in the control group at D81.
At D21, the resting energy expenditure was increased in the rhGH group (107.8% of D0; p < 0.001 compared with the control group). At D21 and D81, the changes in maximal respiratory pressures, handgrip strength, maximal exercise capacity, and subjective well-being were similar in the two groups. At D21, the 6-min walking distance decreased in the rhGH group (-13 +/- 31%) and increased in the control group (+10 +/- 14%; p < 0.01). We conclude that the daily administration of 0.15 IU/kg rhGH during 3 wk increases lean body mass but does not improve muscle strength or exercise tolerance in underweight patients with COPD. We carried out a prospective, randomized, controlled trial to investigate the effect of a 3-month period of supplementary oral nutrition in 14 poorly nourished outpatients with COPD. Seven patients were randomized into Group 1, who received their normal diet during Months 1 to 3, a supplemented diet during Months 4 to 6, and their original normal diet during Months 7 to 9. The other 7 patients received their normal diet for the entire 9-month study period (Group 2). Seven well-nourished patients (Group 3) matched for age and severity of air-flow obstruction served as control subjects; they received their normal diet for the 9-month study period. Measurements of nutritional status, respiratory muscle and handgrip strength, sternomastoid muscle function (including frequency/force curves, maximal relaxation rate, and a fatigability test), lung function, arterial blood gas tensions, general well-being and breathlessness scores, and 6-min walking distances were carried out monthly in all patients. At the start of the study, the poorly nourished patients had lower mean daily calorie and protein intakes than did the well-nourished patients.
The poorly nourished patients had lower respiratory muscle and handgrip strength, and abnormal contractility and increased fatigability of the sternomastoid muscle, compared with the well-nourished patients. After 3 months of supplementary oral nutrition, there was a significant improvement in the nutritional status of Group 1 patients, as evidenced by an increase in body weight, triceps skinfold thickness, and midarm muscle circumference. Respiratory muscle and handgrip strength increased in parallel with nutritional status, although there were no significant changes in lung function or arterial blood gas tensions. (ABSTRACT TRUNCATED AT 250 WORDS) High fat enteral formulas have been advocated for the nutritional support of chronic obstructive pulmonary disease (COPD) patients because dietary fat utilization under ideal conditions produces less CO2 per O2 consumed than carbohydrate. No data exist for these patients comparing the effects of a moderate fat vs. a high fat enteral formula on gastric emptying times (GE) and subsequent CO2 production (VCO2), oxygen consumption (VO2), respiratory quotient (RQ), and pulmonary function. Our double-blind crossover study compared these parameters after feeding a 355 mL (530 kcal) meal with either 41% fat calories (Respalor) or 55% fat calories (Pulmocare). Thirty-six COPD outpatients with a forced expiratory volume in 1 s (FEV1) < 60% of predicted were studied after an overnight fast. Gastric emptying half-time (GE t1/2) was measured using the 99mTc-radionuclide technique; VCO2, VO2, RQ, and other pulmonary functions were measured at 0, 30, 90, and 150 min postprandial using the Canopy Mode of the Deltatrac Metabolic Monitor and the Renaissance Spirometry System. We observed a significantly (p = 0.0001) longer GE t1/2 of the high fat meal when compared to the moderate fat meal (134.1 vs.
108.6 min). At 30 and 90, but not at 150 min postprandial, the VCO2 and VO2 for patients fed the moderate-fat formula were significantly (p = 0.05) higher than for those fed the high-fat formula; no differences were observed for the other pulmonary functions. Although RQ increased significantly (p = 0.01) after both meals, no differences between formulas were noted at all postprandial times tested. Compared to the high-fat meal, the moderate-fat meal significantly enhanced gastric emptying. The earlier rise in VCO2 and VO2 after the moderate-fat meal did not impact pulmonary function and reflected the earlier utilization of the moderate-fat meal. The fact that RQ was not different between the two meals at all postprandial times tested suggests that the higher rise in VCO2 and VO2 after the moderate-fat meal was most likely due to earlier gastric emptying of the moderate-fat meal rather than the difference in the fat-to-carbohydrate ratio between the two tested meals. The impact of these findings on long-term management of COPD patients awaits long-term prospective studies. Energy expenditure was studied in ten patients with chronic obstructive pulmonary disease (COPD) and weight loss, and in five malnourished patients without clinical evidence of COPD (control group), prior to and after a two-week refeeding regimen. Patients received 5 percent dextrose solution (plus electrolytes) for 36 hours to establish standard baseline conditions and were then randomly assigned to either a carbohydrate-based (CB; 53 percent of calories) or fat-based (FB; 55 percent of calories) diet for the first week. The alternate diet was given the following week. Total calorie intake was set at 70 percent above the energy expenditure measured prior to institution of nutritional support.
During energy repletion, energy expenditure was greater than predicted (116 percent) in patients with COPD and less than predicted (90 percent) in the control patients. The thermic effect of nutrients during administration of either regimen was significantly greater (p less than .05) in patients with COPD than in those without COPD during both diets. The difference between the two groups was enhanced during the CB regimen. These observations suggest that malnourished patients with COPD have an elevated resting energy expenditure and an enhanced thermic response to nutrients as compared to malnourished patients without COPD. Increased diet-induced thermogenesis may contribute to weight loss in patients with COPD, in addition to factors previously described such as decreased caloric intake and increased resting energy expenditure. AIM: To assess prospectively the effects of a controlled program of inspiratory muscle training and nutritional support in patients with chronic obstructive lung disease (COPD). PATIENTS AND METHODS: Twenty-three patients with COPD were randomly assigned to four groups. Group I received a 1000 kcal/day nutritional supplement, given as a casein-based enteral nutritional formula; group III was subjected to inspiratory muscle training, using an inexpensive pressure threshold load valve constructed according to the Appropriate Technology principles of the WHO, adjusted at 30% of Maximal Inspiratory Mouth Pressure, and also received the nutritional supplement; group IV was trained but did not receive the nutritional supplement; and group II was neither trained nor supplemented. Patients were studied for three months, and inspiratory muscle function, exercise capacity, and anthropometry were measured monthly. RESULTS: A significant improvement in exercise capacity, maximal inspiratory pressure, and inspiratory muscle endurance was observed in the four groups throughout the study.
Trained subjects had greater improvement in their inspiratory muscle endurance compared to untrained subjects. Nutritional support had no effect on inspiratory muscle function or exercise capacity. No changes in anthropometric measures were observed. CONCLUSIONS: The pressure threshold load valve used in this study improved inspiratory muscle endurance, and nutritional support had no effect in patients with COPD. We studied the effects of oral nutritional supplementation on respiratory muscle (RM) performance in 25 ambulatory patients with severe chronic obstructive pulmonary disease (COPD). There was a relationship between body weight and anthropometric parameters of nutritional status (triceps skinfold thickness [r = 0.67; p less than 0.005], midarm muscle circumference [r = 0.53; p less than 0.005]), but body weight did not correlate with daily caloric intake, serum albumin, transferrin, or blood lymphocyte count. None of these measurements of nutritional status correlated with any measure of RM strength or endurance. In a randomized observer-blinded crossover trial, patients were allocated to one of two groups. In the first eight weeks of the study, group A received nutritional supplementation, and patients in group B were control subjects. In the second eight weeks, patients in group A were control subjects, and group B received supplement. Mean daily caloric intake and body weight increased in both groups while receiving supplement (both p less than 0.05). Calories provided by the supplement were frequently substituted for normal dietary calories. Any increases in RM performance in the group receiving supplement were matched by increases (due to learning) in controls.
We conclude that oral dietary supplements have no important effects on RM performance in ambulatory patients with COPD. OBJECTIVE: One of the goals in treating patients with chronic obstructive pulmonary disease (COPD) who suffer from hypoxemia, hypercapnia, and malnutrition is to correct the malnutrition without increasing the respiratory quotient, and to minimize the production of carbon dioxide. This 3-wk study evaluated the efficacy of feeding a high-fat, low-carbohydrate (CHO) nutritional supplement, as opposed to a high-carbohydrate diet, to COPD patients on parameters of pulmonary function. METHODS: Sixty COPD patients with low body weight (< 90% ideal body weight) were randomized to the control group, which received dietary counseling for a high-CHO diet (15% protein, 20% to 30% fat, and 60% to 70% CHO), or the experimental group, which received two to three cans (237 mL/can) of a high-fat, low-CHO oral supplement (16.7% protein, 55.1% fat, and 28.2% CHO) in the evening as part of the diet. Measurements of lung function (forced expiratory volume in 1 s, or volume of air exhaled in 1 s of maximal expiration; minute ventilation; oxygen consumption per unit time; carbon dioxide production per unit time; and respiratory quotient) and blood gases (pH, arterial carbon dioxide tension, and arterial oxygen tension) were taken at baseline and after 3 wk. RESULTS: Lung function measurements decreased significantly and forced expiratory volume increased significantly in the experimental group. CONCLUSION: This study demonstrates that pulmonary function in COPD patients can be significantly improved with a high-fat, low-CHO oral supplement as compared with the traditional high-CHO diet. OBJECTIVE: The study aimed to compare the effectiveness of a defined formula diet with a blenderized diet on nutritional and respiratory function parameters and to determine the bacteriological load of the two formulations.
METHODOLOGY: Seventeen patients, aged 50-75 years, admitted to the University of the Philippines-Philippine General Hospital for chronic bronchitis and/or emphysema, were studied. They were divided into two groups according to dietary regimens. Each group of patients received either the standardized commercial formula or the blenderized formula for 2 weeks. Dietary intake, anthropometric measurements, laboratory examinations, and lung function were assessed. Subjective evaluation (patient's and physician's assessment) was also sought. Microbiological examinations were performed on the prepared enteral formulas. RESULTS: There was a slight increase in weight and in pulmonary function in both groups, but these results did not differ significantly. Possible formula contamination was confirmed. Furthermore, in the overall assessment, the physician and patients rated both formulas as comparable. STUDY OBJECTIVES: Skeletal muscle weakness commonly occurs in patients with COPD. Long-term use of systemic glucocorticosteroids further contributes to muscle weakness. Anabolic steroids could be an additional mode of intervention to improve the outcome of pulmonary rehabilitation by increasing physiologic functioning, possibly mediated by increasing erythropoietic function. PATIENTS AND METHODS: We randomly assigned 63 male patients with COPD to receive, on days 1, 15, 29, and 43, a deep IM injection of 50 mg of nandrolone decanoate (ND) [Deca-Durabolin; N.V. Organon; Oss, The Netherlands] in 1 mL of arachis oil, or 1 mL of arachis oil alone (placebo), in a double-blind design. All patients participated in a standardized pulmonary rehabilitation program. Outcome measures were body composition by deuterium and bromide dilution, respiratory and peripheral muscle function, incremental exercise testing, and health status by the St. George's Respiratory Questionnaire.
RESULTS: Treatment with ND relative to placebo resulted in greater increases in fat-free mass (FFM; mean, 1.7 kg [SD, 2.5] vs 0.3 kg [SD, 1.9]; p = 0.015) owing to a rise in intracellular mass (mean, 1.8 kg [SD, 3.1] vs -0.5 kg [SD, 3.1]; p = 0.002). Muscle function, exercise capacity, and health status improved in both groups to the same extent. Only after ND were increases in erythropoietic parameters seen (erythropoietin: mean, 2.08 U/L [SD, 5.56], p = 0.067; hemoglobin: mean, 0.29 mmol/L [SD, 0.73], p = 0.055). In the total group, the changes in maximal inspiratory mouth pressure (PImax) and peak workload were positively correlated with the change in hemoglobin (r = 0.30, p = 0.032, and r = 0.34, p = 0.016, respectively), whereas the change in isokinetic leg work was correlated with the change in erythropoietin (r = 0.38, p = 0.013). In the patients receiving maintenance treatment with low-dose oral glucocorticosteroids (31 of 63 patients; mean, 7.5 mg/24 h [SD, 2.4]), greater improvements in PImax (mean, 6.0 cm H2O [SD, 8.82] vs -2.18 cm H2O [SD, 11.08], p = 0.046) and peak workload (mean, 20.47 W [SD, 19.82] vs 4.80 W [SD, 7.74], p = 0.023) were seen after 8 weeks of treatment with ND vs placebo. CONCLUSIONS: In conclusion, a short-term course of ND had an overall positive effect relative to placebo on FFM without expanding extracellular water in patients with COPD. In the total group, the improvements in muscle function and exercise capacity were associated with improvements in erythropoietic parameters. The use of low-dose oral glucocorticosteroids as maintenance medication significantly impaired the response to pulmonary rehabilitation with respect to respiratory muscle function and exercise capacity, which could be restored by ND treatment.
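Several of the refeeding trials in this record compare carbohydrate-based and fat-based regimens via the respiratory quotient, which is simply the ratio of CO2 produced to O2 consumed: pure carbohydrate oxidation gives an RQ near 1.0 while fat oxidation gives roughly 0.7, which is why high-fat formulas are expected to generate less CO2 per O2 consumed. A minimal sketch of that calculation (the function name and example values are illustrative, not taken from any of the trials):

```python
def respiratory_quotient(vco2_ml_min: float, vo2_ml_min: float) -> float:
    """RQ = CO2 produced / O2 consumed (any matching units cancel out)."""
    if vo2_ml_min <= 0:
        raise ValueError("VO2 must be positive")
    return vco2_ml_min / vo2_ml_min

# Characteristic values for pure substrate oxidation
# (mixed diets fall somewhere in between):
rq_carb = respiratory_quotient(vco2_ml_min=250.0, vo2_ml_min=250.0)  # ~1.0
rq_fat = respiratory_quotient(vco2_ml_min=196.0, vo2_ml_min=280.0)   # ~0.7
```

An RQ measured above 1.0 during refeeding suggests net lipogenesis from carbohydrate, the condition the fat-based regimens in these trials were designed to avoid.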
11,076
27,536,797
Interventions designed to improve the care transition from hospital to home are effective in reducing hospital readmission. These interventions preferably start in the hospital and continue after discharge, rather than starting after discharge. Enhancing patient empowerment is a key factor in reducing hospital readmissions. Interventions that support patient empowerment are more effective in reducing hospital readmissions (grade B recommendation).
BACKGROUND: Many discharge interventions have been developed to reduce unplanned hospital readmissions, but it is unclear which interventions are more effective. OBJECTIVES: The objective of this review was to identify discharge interventions from hospital to home that reduce hospital readmissions within three months and to understand their effect on secondary outcome measures.
BACKGROUND: Case management is believed to promote continuity of care and decrease hospitalization rates, although few controlled trials have tested this approach. OBJECTIVE: To assess the effectiveness of a standardized telephonic case-management intervention in decreasing resource use in patients with chronic heart failure. METHODS: A randomized controlled clinical trial was used to assess the effect of telephonic case management on resource use. Patients were identified at hospitalization and assigned to receive 6 months of intervention (n = 130) or usual care (n = 228) based on the group to which their physician was randomized. Hospitalization rates, readmission rates, hospital days, days to first rehospitalization, multiple readmissions, emergency department visits, inpatient costs, outpatient resource use, and patient satisfaction were measured at 3 and 6 months. RESULTS: The heart failure hospitalization rate was 45.7% lower in the intervention group at 3 months (P = .03) and 47.8% lower at 6 months (P = .01). Heart failure hospital days (P = .03) and multiple readmissions (P = .03) were significantly lower in the intervention group at 6 months. Inpatient heart failure costs were 45.5% lower at 6 months (P = .04). A cost saving was realized even after intervention costs were deducted. There was no evidence of cost shifting to the outpatient setting. Patient satisfaction with care was higher in the intervention group. CONCLUSIONS: The reduction in hospitalizations, costs, and other resource use achieved using standardized telephonic case management in the early months after a heart failure admission is greater than that usually achieved with pharmaceutical therapy and comparable with other disease management approaches. Background: There is still debate over the benefit of self-management programmes for adults with asthma.
A brief self-management programme given during a hospital admission for acute asthma was tested to determine whether it would reduce readmission. Method: A randomised controlled trial was performed in 280 adult patients with acute asthma admitted over 29 months. Patients on the self-management programme (SMP) received 40-60 minutes of education supporting a written self-management plan. Control patients received standard care (SC). Results: One month after discharge, SMP patients were more likely than SC patients to report no daytime wheeze (OR 2.6, 95% CI 1.5 to 5.3), no night disturbance (OR 2.0, 95% CI 1.2 to 3.5), and no activity limitation (OR 1.5, 95% CI 0.9 to 2.7). Over 12 months, 17% of SMP patients were re-admitted compared with 27% of SC patients (OR 0.5, 95% CI 0.3 to 1.0). Among first-admission patients, the OR for readmission (SMP v SC) was 0.2 (95% CI 0.1 to 0.7), p < 0.01. For patients with a previous admission, the OR for readmission was 0.8 (95% CI 0.4 to 1.6), p = 0.6. SMP patients were more likely than SC patients to be prescribed inhaled steroids at discharge (99% v 92%, p = 0.03) and oral steroids (98% v 90%, p = 0.06), and to have hospital follow-up (98% v 84%, p < 0.01), but adjustment for these differences did not diminish the effect of the self-management programme. Conclusions: A brief self-management programme during hospital admission reduced post-discharge morbidity and readmission for adult asthma patients. The benefit of the programme may have been greater for patients admitted for the first time. The programme also had a small but significant effect on medical management at discharge. OBJECTIVES: To investigate the effectiveness of a pharmacy discharge plan in elderly hospitalized patients. DESIGN: Randomized controlled trial.
SUBJECTS AND SETTINGS: We randomized patients aged 75 years and older on four or more medicines who had been discharged from three acute general and one long-stay hospital to a pharmacy intervention or usual care. INTERVENTIONS: The hospital pharmacist developed discharge plans which gave details of the medication and support required by the patient. A copy was given to the patient and to all relevant professionals and carers. This was followed by a domiciliary assessment by a community pharmacist. In the control group, patients were discharged from hospital following standard procedures that included a discharge letter to the general practitioner listing current medications. OUTCOMES: The primary outcome was re-admission to hospital within 6 months. Secondary outcomes included the number of deaths, attendance at hospital outpatient clinics and general practice, and the proportion of days in hospital over the follow-up period, together with patients' general well-being, satisfaction with the service, and knowledge of and adherence to prescribed medication. RESULTS: We recruited 362 patients, of whom 181 were randomized to each group. We collected hospital and general practice data on at least 91 and 72% of patients respectively at each follow-up point and interviewed between 43 and 90% of the study subjects. There were no significant differences between the groups in the proportion of patients re-admitted to hospital between baseline and 3 months or between 3 and 6 months. There were no significant differences in any of the secondary outcomes. CONCLUSIONS: We found no evidence to suggest that the co-ordinated hospital and community pharmacy care discharge plans in elderly patients in this study influence outcomes. Background: Pre-discharge home visits aim to maximise independence in the community. These visits involve assessment of a person in their own home prior to discharge from hospital, typically by an occupational therapist.
The therapist may provide equipment, adapt the home environment, and/or provide education. The aims of this study were to investigate the feasibility of a randomised controlled trial in a clinical setting and the effect of pre-discharge home visits on functional performance in older people undergoing rehabilitation. Methods: Ten patients participating in an inpatient rehabilitation program were randomly assigned to receive either a pre-discharge home visit (intervention) or standard-practice in-hospital assessment and education (control), both conducted by an occupational therapist. The pre-discharge home visit involved assessment of the older person's function and environment, and education, and took an average of 1.5 hours. The hospital-based interview took an average of 40 minutes. Outcome data were collected by a blinded assessor at 0, 2, 4, 8, and 12 weeks. Outcomes included performance of activities of daily living, reintegration to community living, quality of life, readmission, and fall rates. Results: Recruitment of 10 participants was slow and took three months. Observed performance of functional abilities did not differ between groups due to the small sample size. The difference in activities of daily living participation, as recorded by the Nottingham Extended Activities of Daily Living scale, was statistically significant, but wide confidence intervals and low statistical power limit interpretation of the results. Conclusion: Evaluation of pre-discharge home visits by occupational therapists in a rehabilitation setting is feasible, but a more effective recruitment strategy for a main study is favoured by application of a multi-centre setting. AIMS: To test the effect of education and support by a nurse on self-care and resource utilization in patients with heart failure. METHODS: A total of 179 patients (mean age 73, 58% male, NYHA III-IV) hospitalized with heart failure were evaluated prospectively.
Patients were randomized to the study intervention or to 'care as usual'. The supportive educative intervention consisted of intensive, systematic, and planned education by a study nurse about the consequences of heart failure in daily life, using a standard nursing care plan developed by the researchers for older patients with heart failure. Education and support took place during the hospital stay and at a home visit within a week of discharge. Data were collected on self-care abilities, self-care behaviour, readmissions, visits to the emergency heart centre, and use of other health care resources. RESULTS: Education and support from a nurse in a hospital setting and at home significantly increases self-care behaviour in patients with heart failure. Patients from both the intervention and the control group increased their self-care behaviour within 1 month of discharge, but the increase in the intervention group was significantly greater after 1 month. Although self-care behaviour in both groups decreased during the following 8 months, the increase from baseline remained statistically significant in the intervention group, but not in the control group. No significant effects on resource utilization were found. CONCLUSIONS: Intensive, systematic, tailored, and planned education and support by a nurse results in an increase in patients' self-care behaviour. No significant effects were found on use of health care resources. Additional organisational changes, such as longer follow-up and the availability of a heart failure specialist, would probably enhance the effects of education and support. To select patients for early discharge planning, a randomized clinical trial evaluated a protocol that used risk factors identified upon hospital admission. The goal of the study was to determine if intervention with high-risk patients could reduce the need for hospital admission or skilled care.
Of 13,255 patients screened, 835 study participants were identified as "at risk" for frequent health care resource use. Half of the high-risk patients were randomly assigned to the experimental group (n = 417) and received discharge planning from day 3 of their hospital stay, while the control group (n = 418) received discharge planning only if there was a written physician request. Those patients receiving early, systematic discharge planning experienced an increased likelihood of successful return to home after hospital admission and a decreased chance of unscheduled readmission over the 9-month study period. Length of the index hospital stay was not affected by early planning, however. The major clinical implication is the potential for discharge planners to decrease the need for, and use of, health care resources after hospital admission. Background: Readmission rates for patients with heart failure are a major concern for hospitals worldwide. The importance of patient education and a structured care plan to ease the transition from hospital to home has been the focus of many intervention strategies to reduce readmission rates. The use of transitioning-of-care plans is believed to improve medication reconciliation, communication, patient education, and follow-up. To date, the evidence has not been systematically evaluated to support the effectiveness of nurse-coordinated transitioning of care for patients with heart failure in reducing readmission rates. Objective: The objective of the systematic review was to identify the best available evidence on the effectiveness of nurse-coordinated transitioning of care between hospital and home on hospital readmission rates for all causes in adult patients hospitalised with heart failure. Search strategy: The search strategy aimed to find both published and unpublished studies in the English language from January 1975 through July 2010.
A search of MEDLINE, CINAHL, PsycINFO, Health Source Nursing/Academic Edition, EMBASE, the Cochrane Library, and the Joanna Briggs Institute Library of Systematic Reviews was conducted, followed by a reference search of relevant studies. The initial key words searched were: heart failure, readmission, and transitional care. Inclusion criteria: Randomised controlled trials that evaluated the effect of nurse-coordinated transitioning of care from hospital to home on readmission rates in adult patients with heart failure were selected. The outcome was defined as hospital readmissions for all causes following an initial admission for heart failure. Data collection and analysis: Studies selected for retrieval were critically evaluated by two independent reviewers for methodological validity using standardised critical appraisal instruments from the Joanna Briggs Institute Meta-Analysis of Statistics Assessment and Review Instrument (JBI-MAStARI). Data were extracted and analysed using the JBI-MAStARI program. Results: A total of 16 randomised controlled studies were included. Ten of the 16 studies included in the review show that a nurse-led transitioning-of-care intervention can reduce the rate of readmission for patients with heart failure. Interventions utilising home visits, or home visits coupled with telephone follow-up, show a more favourable reduction in readmission rates. Conclusions: Reduced readmissions occur when transitioning-of-care interventions are carried out by a heart-failure-trained nurse who conducts at least one home visit and follows the patient at least weekly for a minimum of 30 days post-discharge with either additional home visits or telephone contact.
Implications for practice This review supports the development of a nurse coordinated transitioning of care plan, which will require improvements in communication, in addition to changes in health policy and payment systems that align incentives and performance measures in caring for patients with heart failure. Implications for research Future research should evaluate the effect of the intensity and duration of the transitioning of care intervention on readmission rates in a large randomised controlled trial in an adult population with heart failure, to determine the ideal frequency and duration of the post discharge interventions Background Hospital readmission soon after discharge is common and costly. To date, published studies of the effectiveness of a structured discharge process in reducing hospital readmission have focused on patients with chronic conditions and complex needs, but not on adult patients with community acquired pneumonia. Objectives To examine and synthesise the best available evidence on the effectiveness of a structured discharge process in reducing hospital readmission of adult patients with community acquired pneumonia. Inclusion criteria Types of participants This review considered studies that included hospitalised adult patients diagnosed with community acquired pneumonia regardless of gender, ethnicity, severity, and co-morbidities. Types of interventions Structured discharge processes related to early patient engagement, patient-caregiver dyad intervention, transitional care, coordinated care, and a multidisciplinary team approach. Types of outcome measures The outcome measures included in this review were hospital readmission, emergency room visits, and unscheduled visits to a healthcare provider. Types of studies Randomised controlled trials (RCTs) and quasi-experimental studies were considered for inclusion.
Search strategy The search strategy aimed to find both published and unpublished studies in the English language, without date limits. A search of PubMed/MEDLINE, CINAHL, CINAHL Plus, EMBASE, Cochrane Central Register of Controlled Trials (CENTRAL), PsycINFO, Academic Search Premier, Health Source Nursing/Academic Edition and seven other databases was conducted. Methodological quality Studies were critically appraised by two independent reviewers using the Joanna Briggs Institute's standardised critical appraisal tool. Data extraction Data were extracted using the standardised Joanna Briggs Institute data extraction instruments. Data synthesis Statistical pooling in meta-analysis was not appropriate. Findings are presented in a narrative form. Results Three articles were included in the review, two RCTs and one pseudo-randomised controlled clinical trial. The structured discharge process did not have a positive impact in reducing hospital readmission at 30, 90, and 180 days or in reducing emergency room visits at 30 days. The outcome measure of unscheduled visits to a healthcare provider was not measured in any of the three studies. The incorporation of medication reconciliation with follow-up telephone calls, either by an advanced practice nurse, care coordinator, or a clinical pharmacist, was an effective strategy in reducing hospital readmission in all three studies and in reducing emergency room visits in one of the studies. Conclusions Medication reconciliation with the addition of follow-up telephone calls and incorporation of either an advanced practice nurse, care coordinator, or a clinical pharmacist using a multidisciplinary team approach may have implications for the existing coordination of care of adult patients with community acquired pneumonia.
Implications for practice This review recommends the use of medication reconciliation with follow-up telephone calls, either by an advanced practice nurse, care coordinator, or a clinical pharmacist, as part of the structured discharge process to reduce hospital readmission of adult patients with community acquired pneumonia. Implications for research Further research is needed to examine the effectiveness of a structured discharge process in reducing hospital readmission of adult patients with community acquired pneumonia PURPOSE This work addresses the unanswered question of whether multidisciplinary care (MDC) of heart failure (HF) can reduce readmissions when optimal medical care is applied in both intervention and control groups. METHODS In a randomized, controlled study, 98 patients (mean age, 70.8 ± 10.5 years) admitted to hospital with left ventricular failure (New York Heart Association Class IV) were assigned to routine care (RC, n = 47) or MDC (n = 51). All patients received the same components of inpatient, optimal medical care of HF: specialist-led inpatient care; titration to maximum tolerated dose of angiotensin-converting enzyme inhibitor before discharge; attainment of predetermined discharge criteria (weight stable, off all intravenous therapy, and no change in oral regimen for 2 days). Only those in the MDC group received inpatient and outpatient education and close telephone and clinic follow-up. The primary study endpoint was rehospitalization or death for a HF-related issue at 3 months. MAIN FINDINGS At 3 months, four people had events in the MDC group (7.8% rate over 3 months) compared with 12 people (25.5% rate over 3 months) in the RC group (P = 0.04). CONCLUSION These data demonstrate for the first time the intrinsic benefit of MDC in the setting of protocol-driven, optimal medical management of HF.
Moreover, the event rate of 7.8% at 3 months, as the lowest reported rate for such a high-risk group, underlines the value of this approach to the management of heart failure OBJECTIVES To determine whether a new multimodal comprehensive discharge-planning intervention would reduce emergency rehospitalizations or emergency department (ED) visits for very old inpatients. DESIGN Six-month prospective, randomized (Zelen design), parallel-group, open-label trial. SETTING Six acute geriatric units (AGUs) in Paris and its surroundings. PARTICIPANTS Six hundred sixty-five consecutive inpatients aged 70 and older (intervention group (IG) n = 317; control group (CG) n = 348). INTERVENTION Intervention-dedicated geriatricians different from those in the study centers implemented the intervention, which targeted three risk factors for preventable readmissions and consisted of three components: comprehensive chronic medication review, education on self-management of disease, and detailed transition-of-care communication with outpatient health professionals. MEASUREMENTS Emergency hospitalization or ED visit 3 and 6 months after discharge, as assessed by telephone calls to the participant, the caregiver, and the general practitioner and confirmed with the hospital administrative database. RESULTS Twenty-three percent of IG participants were readmitted to hospital or had an ED visit 3 months after discharge, compared with 30.5% of CG participants (P = .03); at 6 months, the proportions were 35.3% and 40.8%, respectively (P = .15). Event-free survival was significantly higher in the IG at 3 months (hazard ratio (HR) = 0.72, 95% confidence interval (CI) = 0.53-0.97, P = .03) but not at 6 months (HR = 0.81, 95% CI = 0.64-1.04, P = .10). CONCLUSION This intervention was effective in reducing rehospitalizations and ED visits for very elderly participants 3 but not 6 months after their discharge from the AGU.
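The hazard ratios reported above illustrate a general reading rule: a ratio estimate (HR, OR, RR) is significant at the two-sided 5% level exactly when its 95% confidence interval excludes 1, provided the interval and the test come from the same model. A minimal sketch, using the values from the abstract above:

```python
def ci_excludes_null(ci_low, ci_high, null_value=1.0):
    """True when a confidence interval excludes the null value,
    i.e. the corresponding two-sided test is significant."""
    return ci_high < null_value or ci_low > null_value

# Event-free survival hazard ratios reported in the abstract above
hr_3_months = (0.72, (0.53, 0.97))   # reported P = .03
hr_6_months = (0.81, (0.64, 1.04))   # reported P = .10

print(ci_excludes_null(*hr_3_months[1]))  # True  -> significant at 3 months
print(ci_excludes_null(*hr_6_months[1]))  # False -> not significant at 6 months
```

This matches the abstract's conclusion: the effect was significant at 3 months but not at 6 months.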
Future research should investigate the effect of this intervention of transitional care in a larger population and in usual acute and subacute geriatric care BACKGROUND Patients with complex care needs who require care across different health care settings are vulnerable to experiencing serious quality problems. A care transitions intervention designed to encourage patients and their caregivers to assert a more active role during care transitions may reduce rehospitalization rates. METHODS Randomized controlled trial. Between September 1, 2002, and August 31, 2003, patients were identified at the time of hospitalization and were randomized to receive the intervention or usual care. The setting was a large integrated delivery system located in Colorado. Subjects (N = 750) included community-dwelling adults 65 years or older admitted to the study hospital with 1 of 11 selected conditions. Intervention patients received (1) tools to promote cross-site communication, (2) encouragement to take a more active role in their care and to assert their preferences, and (3) continuity across settings and guidance from a "transition coach." Rates of rehospitalization were measured at 30, 90, and 180 days. RESULTS Intervention patients had lower rehospitalization rates at 30 days (8.3 vs 11.9, P = .048) and at 90 days (16.7 vs 22.5, P = .04) than control subjects. Intervention patients had lower rehospitalization rates for the same condition that precipitated the index hospitalization at 90 days (5.3 vs 9.8, P = .04) and at 180 days (8.6 vs 13.9, P = .046) than controls. The mean hospital costs were lower for intervention patients ($2058) vs controls ($2546) at 180 days (log-transformed P = .049).
CONCLUSION Coaching chronically ill older patients and their caregivers to ensure that their needs are met during care transitions may reduce the rates of subsequent rehospitalization BACKGROUND Nursing approaches to manage patients with heart failure (HF) have shown benefits in reducing morbidity and mortality. However, combining intra-hospital education with telephone contact after hospital discharge has been little explored. OBJECTIVE To compare two nursing intervention groups among patients hospitalized due to decompensated HF: the intervention group (IG) received educational nursing intervention during hospitalization followed by telephone monitoring after discharge, and the control group (CG) received in-hospital intervention only. Outcomes were levels of HF and self-care knowledge, the frequency of visits to the emergency room, rehospitalizations and deaths in a three-month period. METHODS Randomized clinical trial. We studied adult HF patients with left ventricular ejection fraction (LVEF) < 45% who could be contacted by telephone after discharge. HF awareness was evaluated through a standardized questionnaire that also included questions regarding self-care knowledge, which was answered during the hospitalization period and three months later. For patients in the IG, contacts were made by phone call, and final interviews were conducted in both groups at the end of the study. RESULTS Forty-eight patients were assigned to the IG and 63 to the CG. Mean age (63 ± 13 years) and LVEF (around 29%) were similar in the two groups. Scores for HF and self-care knowledge were similar at baseline. Three months later, both groups showed significantly improved HF awareness and self-care knowledge scores (P < 0.001). Other outcomes were similar.
CONCLUSION An in-hospital educational nursing intervention benefitted all HF patients in understanding their disease, regardless of telephone contact after discharge Background: home visits and telephone calls are two often used approaches in transitional care, but their differential effects are unknown. Objective: to examine the overall effects of a transitional care programme for discharged medical patients and the differential effects of telephone calls only. Design: randomised controlled trial. Setting: a regional hospital in Hong Kong. Participants: patients discharged from medical units fitting the inclusion criteria (n = 610) were randomly assigned to: control (‘control’, n = 210), home visits with calls (‘home’, n = 196) and calls only (‘call’, n = 204). Intervention: the home groups received alternating home visits and calls and the call groups calls only for 4 weeks. The control group received two placebo calls. The nurse case manager was supported by nursing students in delivering the interventions. Results: the home visit group (after 4 weeks 10.7%, after 12 weeks 21.4%) and the call group (11.8%, 20.6%) had lower readmission rates than the control group (17.6%, 25.7%). Significant differences were detected in intention-to-treat (ITT) analysis for the home and intervention group (home and call combined) at 4 weeks. In the per-protocol analysis (PPA) results, significant differences were found in all groups at 4 weeks. There was significant improvement in quality of life, self-efficacy and satisfaction in both ITT and PPA for the study groups. Conclusions: this study has found that bundled interventions involving both home visits and calls are more effective in reducing readmissions.
Many of the transitional care programmes use all-qualified nurses, and this study reveals that a mixed skills model seems to bring about positive effects as well Objective To evaluate the impact of pharmacotherapeutic counseling on the rates and causes of 30-day post-discharge hospital readmissions and emergency department visits. Setting The study was conducted at the Medical Clinic of University Hospital Dubrava, Zagreb, Croatia. Methods The study included elderly patients prescribed two or more medications for the treatment of chronic diseases. The patients randomized into the intervention group received pre-discharge counseling by the clinical pharmacologist about each prescribed medication. The control group received no counseling. Main outcome measures The rates and causes of 30-day post-discharge hospital readmissions and emergency department visits. Medication compliance was also evaluated, using the pill count method. Results A total of 160 patients were randomly selected for the study. No significant difference was found in the readmission and emergency department visit rates between the intervention and control groups (p = 0.224). There were 34.9% more compliant patients in the intervention group. Significantly more non-compliant patients in the control group were readmitted or visited the emergency department because of disease progression (p = 0.031). In the intervention group, significantly more patients were readmitted or visited the emergency department because of an adverse drug reaction (p = 0.022). Conclusion Pharmacotherapeutic counseling can reduce readmission and emergency department visit rates for disease progression.
Improved patient knowledge about adverse drug reactions could be the reason for the increased rates of readmissions and emergency department visits due to adverse drug reactions in the intervention group OBJECTIVES To determine if post-discharge telephonic case management (CM) reduces emergent hospital readmissions for select high-risk patients. STUDY DESIGN Prospective, randomized. METHODS We conducted a prospective, randomized control study of the effect of hospital discharge planning from health plan telephonic case managers on readmissions for high-risk patients. High risk was defined as having an initial discharge major diagnosis of gastrointestinal, heart, or lower respiratory disease and a length of stay of 3 days or more. The intervention group (N = 1994) received telephonic outreach and engagement within 24 hours of discharge, and their calls were made in descending risk order to engage the highest risk first. The control group (N = 1994) received delayed telephonic outreach and engagement 48 hours after discharge notification, and no call order by risk was applied. Comparison groups had statistically equivalent characteristics at baseline (P > .05). RESULTS The intent-to-treat 60-day readmission rate for the treatment group was 7.4% versus 9.6% for the control group (P = .01), representing a 22% relative reduction in all-cause readmissions. Two post hoc assessments were conducted to identify potential mechanisms of action for this effect and showed that the treatment group had more physician visits and prescription drug fills following initial discharge. CONCLUSIONS Telephonic CM reduces the likelihood of 60-day readmissions for select high-risk patients. This study suggests that prioritizing telephonic outreach to a select group of high-risk patients based on their discharge date and risk severity is an effective case management strategy.
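The "22% relative reduction" quoted above follows directly from the two readmission rates; a minimal check (using the rounded percentages reported in the abstract, which give roughly 23%, so the published 22% presumably reflects unrounded counts):

```python
def relative_reduction(control_rate, treatment_rate):
    """Relative reduction in event rate: (Rc - Rt) / Rc."""
    return (control_rate - treatment_rate) / control_rate

# 60-day readmission rates reported above: 9.6% control vs 7.4% treatment
rr = relative_reduction(0.096, 0.074)
print(f"{rr:.1%}")  # 22.9% from the rounded rates; abstract reports 22%
```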
Future studies should explore patients' activity beyond phone calls to further explain the mechanism for readmission reduction PURPOSE Several randomized trials have found that discharge planning improves outcomes for hospitalized patients. We do not know if adding a clinical nurse specialist (CNS) to physician teams in hospitals that already have discharge planning services makes a difference. METHODS In 2 teaching hospitals, patients were randomly assigned to regular hospital care or care with a clinical nurse specialist. The clinical nurse specialist facilitated hospital care by retrieving preadmission information, arranging in-hospital consultations and investigations, organizing postdischarge follow-up visits, and checking up on patients postdischarge with a telephone call. In-hospital outcomes included mortality and length of stay. Postdischarge outcomes included time to readmission or death, patient satisfaction, and the risk of adverse events. Adverse events were poor outcomes due to medical care rather than the natural history of disease. RESULTS A total of 620 sequential patients were randomized (CNS n = 307, control n = 313), of which 361 were followed after discharge from hospital (CNS n = 175, control n = 186). The groups were similar for the probability of in-hospital death (CNS 9.3% vs control 9.7%) or being discharged to the community (58.0% vs 60.0%). The groups did not differ for postdischarge outcomes including readmission or death (21.6% vs 15.6%; P = 0.16) or risk of adverse events (23.6% vs 22.8%). Mean [SD] patient ratings of overall quality of care on a scale of 10 were higher in the clinical nurse specialist group (8.2 [2.2] vs 7.6 [2.4]; P = 0.052).
CONCLUSION The addition of a clinical nurse specialist to a medical team improved patient satisfaction but did not impact hospital efficiency or patient safety Remote monitoring (RM) of homebound heart failure (HF) patients has previously been shown to reduce hospital admissions. We conducted a pilot trial of ambulatory, non-homebound patients recently hospitalized for HF to determine whether RM could be successfully implemented in the ambulatory setting. Eligible patients from Massachusetts General Hospital (n = 150) were randomized to a control group (n = 68) or to a group that was offered RM (n = 82). The participants transmitted vital signs data to a nurse who coordinated care with the physician over the course of the 6-month study. Participants in the RM program had a lower all-cause per-person readmission rate (mean = 0.64, SD ± 0.87) compared to the usual care group (mean = 0.73, SD ± 1.51; P-value = .75), although the difference was not statistically significant. The HF-related readmission rate was similarly reduced in participants. This pilot study demonstrates that RM can be successfully implemented in non-homebound HF patients and may reduce readmission rates Background. The growing number of patients with congestive heart failure has increased both the pressure on hospital resources and the need for community management of the condition. Improving hospital-to-home transition for this population is a logical step in responding to current practice guidelines' recommendations for coordination and education. Positive outcomes have been reported from trials evaluating multiple interventions, enhanced hospital discharge, and follow-up through the addition of a case management role. The question remains if similar gains could be achieved working with usual hospital and community nurses. Methods.
A 12-week, prospective, randomized controlled trial was conducted of the effect of transitional care on health-related quality of life (disease-specific and generic measures), rates of readmission, and emergency room use. The nurse-led intervention focused on the transition from hospital-to-home and supportive care for self-management 2 weeks after hospital discharge. Results. At 6 weeks after hospital discharge, the overall Minnesota Living with Heart Failure Questionnaire (MLHFQ) score was better among the Transitional Care patients (27.2 ± 19.1 SD) than among the Usual Care patients (37.5 ± 20.3 SD; P = 0.002). Similar results were found at 12 weeks postdischarge for the overall MLHFQ and at 6 and 12 weeks postdischarge for the MLHFQ's Physical Dimension and Emotional Dimension subscales. Generic quality of life, as assessed by the SF-36 Physical Component, Mental Component, and General Health subscales, did not differ significantly between the Transitional and Usual Care groups. At 12 weeks postdischarge, 31% of the Usual Care patients had been readmitted compared with 23% of the Transitional Care patients (P = 0.26), and 46% of the Usual Care group visited the emergency department compared with 29% in the Transitional Care group (χ2 = 4.86, df 1, P = 0.03). Conclusions. There were significant improvements in health-related quality of life (HRQL) associated with Transitional Care and less use of emergency rooms BACKGROUND the usefulness of geriatric evaluation and management (GEM) approaches in the care of frail elderly patients remains uncertain. We examined whether an inpatient geriatric consultation service might be beneficial in a country with a social welfare system. METHODS we conducted a randomised trial with 345 patients from five centres. Ninety additional patients from four separate centres without GEM teams served as an external comparison.
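The χ2 value quoted for the emergency-department comparison above comes from a standard Pearson test of independence on a 2×2 table. A sketch of the computation with hypothetical counts (the abstract does not give the per-arm denominators, so 100 patients per arm is assumed purely for illustration; the resulting statistic therefore differs from the reported 4.86):

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]],
    without continuity correction."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: 29% vs 46% ED visits, assuming 100 patients per arm
stat = chi2_2x2(29, 71, 46, 54)  # transitional care row, usual care row
print(round(stat, 2))  # 6.17 with these assumed counts
```

With the study's true group sizes the same formula would reproduce the reported χ2 = 4.86 on 1 degree of freedom.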
All patients were hospitalised, at least 65 years old and frail. Patients were randomly assigned to either comprehensive geriatric assessment and management in the form of consultations and follow-up, or usual care. Primary outcomes were rehospitalisation and nursing home placement 1 year after randomisation. Secondary outcomes were survival, functional, emotional and cognitive status, social situation and quality of life. FINDINGS at 12 months, the groups did not differ in the rate of rehospitalisation (intervention 67%, control 60%, P=0.30), nursing home placement (intervention 19%, control 14%, P=0.27), survival (intervention 81%, control 85%, P=0.56) or any of the other secondary measures. The external comparison groups were also similar in nursing home placement (16%, P=0.40), survival (80%, P=0.88) and all the secondary variables, but rehospitalisation was less frequent (48%, P=0.04). No subgroup benefited from the intervention. INTERPRETATION care provided by consultation teams did not improve the rates of rehospitalisation or nursing home placement. This is not due to carry-over effects of geriatric knowledge into the control group Background: This study evaluated the effectiveness of using trained volunteer staff in reducing 30-day readmissions of congestive heart failure (CHF) patients. Methods: From June 2010 to December 2010, 137 patients (mean age 73 years) hospitalized for CHF were randomly assigned to either: an interventional arm (arm A) receiving dietary and pharmacologic education by a trained volunteer, follow-up telephone calls within 48 hours, and a month of weekly calls; or a control arm (arm B) receiving standard care. Primary outcomes were 30-day readmission rates for CHF and worsening New York Heart Association (NYHA) functional classification; composite and all-cause mortality were secondary outcomes. Results: Arm A patients had decreased 30-day readmissions (7% vs 19%; P < .05) with a relative risk reduction (RRR) of 63% and an absolute risk reduction (ARR) of 12%. The composite outcome of 30-day readmission, worsening NYHA functional class, and death was decreased in arm A (24% vs 49%; P < .05; RRR 51%, ARR 25%). Standard-care treatment and hypertension, age ≥65 years and hypertension, and cigarette smoking were predictors of increased risk for readmissions, worsening NYHA functional class, and all-cause mortality, respectively, in the multivariable analysis. Conclusions: Utilizing trained volunteer staff to improve patient education and engagement might be an efficient and low-cost intervention to reduce CHF readmissions IMPORTANCE Socioeconomic and behavioral factors can negatively influence posthospital outcomes among patients of low socioeconomic status (SES). Traditional hospital personnel often lack the time, skills, and community linkages required to address these factors. OBJECTIVE To determine whether a tailored community health worker (CHW) intervention would improve posthospital outcomes among low-SES patients. DESIGN, SETTING, AND PARTICIPANTS A 2-armed, single-blind, randomized clinical trial was conducted between April 10, 2011, and October 30, 2012, at 2 urban, academically affiliated hospitals. Of 683 eligible general medical inpatients (ie, low-income, uninsured, or Medicaid) that we screened, 237 individuals (34.7%) declined to participate. The remaining 446 patients (65.3%) were enrolled and randomly assigned to study arms. Nearly equal percentages of control and intervention group patients completed the follow-up interview (86.6% vs 86.9%). INTERVENTIONS During hospital admission, CHWs worked with patients to create individualized action plans for achieving patients' stated goals for recovery. The CHWs provided support tailored to patient goals for a minimum of 2 weeks.
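The ARR and RRR figures in the volunteer-staff trial above follow directly from the two event rates (ARR is the difference in rates, RRR is that difference as a fraction of the control rate); a minimal check:

```python
def risk_reductions(control_rate, treatment_rate):
    """Absolute and relative risk reduction from two event rates."""
    arr = control_rate - treatment_rate
    rrr = arr / control_rate
    return arr, rrr

# 30-day readmissions: 19% standard care vs 7% volunteer intervention
arr, rrr = risk_reductions(0.19, 0.07)
print(f"ARR {arr:.0%}, RRR {rrr:.0%}")   # ARR 12%, RRR 63%

# Composite outcome: 49% vs 24%
arr2, rrr2 = risk_reductions(0.49, 0.24)
print(f"ARR {arr2:.0%}, RRR {rrr2:.0%}")  # ARR 25%, RRR 51%
```

Both pairs reproduce the abstract's reported figures exactly.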
MAIN OUTCOMES AND MEASURES The prespecified primary outcome was completion of primary care follow-up within 14 days of discharge. Prespecified secondary outcomes were quality of discharge communication, self-rated health, satisfaction, patient activation, medication adherence, and 30-day readmission rates. RESULTS Using intention-to-treat analysis, we found that intervention patients were more likely to obtain timely posthospital primary care (60.0% vs 47.9%; P = .02; adjusted odds ratio [OR], 1.52; 95% CI, 1.03-2.23), to report high-quality discharge communication (91.3% vs 78.7%; P = .002; adjusted OR, 2.94; 95% CI, 1.5-5.8), and to show greater improvements in mental health (6.7 vs 4.5; P = .02) and patient activation (3.4 vs 1.6; P = .05). There were no significant differences between groups in physical health, satisfaction with medical care, or medication adherence. Similar proportions of patients in both arms experienced at least one 30-day readmission; however, intervention patients were less likely to have multiple 30-day readmissions (2.3% vs 5.5%; P = .08; adjusted OR, 0.40; 95% CI, 0.14-1.06). Among the subgroup of 63 readmitted patients, recurrent readmission was reduced from 40.0% to 15.2% (P = .03; adjusted OR, 0.27; 95% CI, 0.08-0.89). CONCLUSIONS AND RELEVANCE A patient-centered CHW intervention improves access to primary care and quality of discharge communication while controlling recurrent readmissions in a high-risk population. Health systems may leverage the CHW workforce to improve posthospital outcomes by addressing behavioral and socioeconomic drivers of disease. TRIAL REGISTRATION clinicaltrials.gov Identifier: NCT01346462 AIM The objective of this study was to examine the effectiveness of a discharge plan in hospitalized elderly patients with hip fracture due to falling. BACKGROUND Hip fractures are an important cause of morbidity and mortality among older people.
Hip fracture patients require ongoing medical and long-term care services. Discharge planning services can play a very important role for these patients, since such services can improve their outcomes. METHODS Hip fracture patients aged 65 years and older (n = 126), hospitalized due to falling and discharged from a medical centre in northern Taiwan, were randomly assigned to either a comparison group (routine care) or an experimental group (the discharge planning intervention). The outcomes used to determine the effectiveness of the intervention were: length of hospital stay, rate of readmission, repeat falls and survival, and activities of daily living. RESULTS The discharge planning intervention decreased length of stay and rate of readmission, improved rate of survival, and improved activities of daily living for the intervention group compared with the control group. Mean total SF-36 scores of patients in the experimental group were higher than for the control group, and both groups had improved quality of life. CONCLUSION Discharge planning benefited older people with hip fractures. RELEVANCE TO CLINICAL PRACTICE A discharge planning intervention by a nurse can improve physical outcomes and quality of life in hip fracture patients OBJECTIVE To pilot-test the feasibility and preliminary effect of a community health worker (CHW) intervention to reduce hospital readmissions. DESIGN Patient-level randomized quality improvement intervention. SETTING An academic medical center serving a predominantly low-income population in the Boston, Massachusetts area and 10 affiliated primary care practices. PARTICIPANTS Medical service patients with an in-network primary care physician who were discharged to home (n = 423) and had one of five risk factors for readmission within 30 days.
INTERVENTION Inpatient introductory visit and weekly post-discharge telephonic support for 4 weeks to assist the patient in coordinating medical visits, obtaining and using medications, and in self-management. MAIN OUTCOME MEASURES Number of completed CHW contacts; CHW-reported barriers and facilitators to assisting patients; primary care, emergency department and inpatient care use. RESULTS Roughly 70% of patients received at least one post-discharge CHW call; only 38% of patients received at least four calls as intended. Hospital readmission rates were lower among CHW patients (15.4%) compared with usual care (17.9%); the difference was not statistically significant. CONCLUSION Under performance-based payment systems, identifying cost-effective solutions for reducing hospital readmissions will be crucial to the economic survival of all hospitals, especially safety-net systems. This pilot study suggests that with appropriate supportive infrastructure, hospital-based CHWs may represent a feasible strategy for improving transitional care among vulnerable populations. An ongoing, randomized, controlled trial of a CHW intervention, developed according to the lessons of this pilot, will provide further insight into the utility of this approach to reducing readmissions OBJECTIVES To evaluate an interdisciplinary intervention program for older people with hip fracture in Taiwan. DESIGN Randomized experimental design. SETTING A 3,800-bed medical center in northern Taiwan. PARTICIPANTS Elderly patients with hip fracture (N=137) were randomly assigned to an experimental (n=68) or control (n=69) group. INTERVENTION An interdisciplinary program of geriatric consultation, continuous rehabilitation, and discharge planning. MEASUREMENTS Demographic and outcome variables were measured.
Outcome variables included service utilization, clinical outcomes, self-care abilities, health-related quality-of-life (HRQOL) outcomes, and depressive symptoms. RESULTS Subjects in the experimental group improved significantly more than those in the control group in the following outcomes: ratio of hip flexion 1 month after discharge (P=.02), recovery of previous walking ability at 1 month (P=.04) and 3 months (P=.001) after discharge, and activities of daily living at 1 month (P=.01) and 2 months (P=.001) after discharge. Three months after discharge, the experimental group showed significant improvement in peak force of the fractured limb's quadriceps (P=.04) and the following health outcomes: bodily pain (P=.03), vitality (P<.001), mental health (P=.02), physical function (P<.001), and role physical (P=.006). They also had fewer depressive symptoms (P=.008) 3 months after discharge. CONCLUSION This intervention program may benefit older people with hip fractures in Taiwan by improving their clinical outcomes, self-care abilities, and HRQOL and by decreasing depressive symptoms within 3 months after discharge Background The importance of comprehensive discharge planning plus post-discharge support intervention studies has been advocated for patients with heart failure. Telephone-based post-discharge nursing care has been reported as a significant nursing care approach that is believed to promote continuity of care. However, there is a lack of systematically examined evidence to support the effectiveness of nurse-led telephone-based post-discharge care for heart failure patients. Objective The purpose of this study was to examine the best available evidence to determine the effectiveness of telephone-based post-discharge nursing care of patients with HF and to quantify the effect on all-cause readmission rates of these patients.
Search strategy A thorough search of published and unpublished literature was conducted from 1990 to 2009. First, computer databases such as CINAHL, MEDLINE, COCHRANE, EMBASE, and Dissertation Abstracts International were included, and then a reference search was conducted with relevant studies. Initial keywords were “readmission,” “heart failure,” “*tele*,” “nurs*,” “post-discharge,” and “telephone-based.” Selection criteria Randomized controlled trial studies using telephone-based post-discharge nursing care were selected. Outcome was defined as the frequency of unplanned, HF-symptom-associated all-cause hospital readmission during the 6 months after discharge. Data collection and analysis Included studies were critically evaluated for methodological quality by two independent reviewers. Data were extracted and then analyzed using the JBI-MAStARI and the RevMan 5 program. Main results A total of 10 randomized controlled trial studies were included in this review. The results showed that telephone-based post-discharge nursing care was a significant intervention, especially in decreasing the hospital readmission rate among HF patients. Conclusion The findings of this review suggest a positive effect of telephone-based post-discharge nursing care and that it is an effective intervention in decreasing the readmission rate among HF patients. Implications for practice/research The results of this systematic review indicate the need for well-designed studies that examine other aspects of post-discharge nursing telephone interventions to decrease the hospital readmission rate among HF patients, including cost-effectiveness, feasibility and appropriateness We studied whether pharmacists involved in discharge planning can improve patient satisfaction and outcomes by providing telephone follow-up after hospital discharge. We conducted a randomized trial at the General Medical Service of an academic teaching hospital.
We enrolled General Medical Service patients who received pharmacy-facilitated discharge from the hospital to home. The intervention consisted of a follow-up phone call by a pharmacist 2 days after discharge. During the phone call, pharmacists asked patients about their medications, including whether they had obtained them and understood how to take them. Two weeks after discharge, we mailed all patients a questionnaire to assess satisfaction with hospitalization and reviewed hospital records. Of the 1,958 patients discharged from the General Medical Service from August 1, 1998 to March 31, 1999, 221 patients consented to participate. We randomized 110 to the intervention group (phone call) and 111 to the control group (no phone call). Patients returned 145 (66%) surveys. More patients in the phone call group than in the no phone call group were satisfied with discharge medication instructions (86% vs. 61%, P = 0.007). The phone call allowed pharmacists to identify and resolve medication-related problems for 15 patients (19%). Twelve patients (15%) contacted by telephone reported new medical problems requiring referral to their inpatient team. Fewer patients from the phone call group returned to the emergency department within 30 days (10% phone call vs. 24% no phone call, P = 0.005). A follow-up phone call by a pharmacist involved in the hospital care of patients was associated with increased patient satisfaction, resolution of medication-related problems, and fewer return visits to the emergency department BACKGROUND Many patients encounter problems in the first weeks after discharge from hospital. Telephone follow-up (TFU) is reputed to be a good tool for providing medical advice, managing symptoms, identifying complications and giving reassurance after discharge. Therefore, we aimed to study whether tight TFU would increase patient satisfaction, improve compliance and reduce the re-hospitalization rate.
METHODS The study population included 400 patients hospitalized in an Internal Medicine Department, randomly divided into two groups: TFU and control. TFU took place one week and one month after discharge. Three months later, members of both groups were contacted by telephone. RESULTS Satisfaction was increased in the TFU group compared with the control group by 6–12% in most fields. Notably, 87% of patients in the TFU group indicated that earlier telephone contact increased their satisfaction. In addition, 78.2% of the patients in the control group reported that they performed the tests that were recommended at discharge and 86.5% reported that they received explanations regarding their medications. In the TFU group, these percentages increased significantly to 86.9% (P=0.02) and 96.7% (P<0.0001), respectively. As to treatment results, 93% of the patients in the TFU group, as compared to 84% in the control group, reported improvement in their symptoms. A non-significant trend towards fewer readmissions was observed in the TFU group (26% vs. 35%, P=0.062). CONCLUSIONS TFU can improve medical treatment by increasing satisfaction and compliance. A trend towards decreased readmission rates was observed, which may lead to a reduction in the burden on the medical system OBJECTIVE The purpose of this study was to determine the effect of a tailored message intervention on heart failure readmission rates, quality of life, and health beliefs in persons with heart failure (HF). DESIGN This randomized controlled trial provided a tailored message intervention during hospitalization and 1 week and 1 month after discharge. THEORETIC FRAMEWORK The organizing framework was the Health Belief Model. SUBJECTS Seventy persons with a primary diagnosis of chronic HF were included in the study. RESULTS HF readmission rates and quality of life did not significantly differ between the treatment and control groups.
Health beliefs, except for benefits of medications, significantly changed from baseline in the treatment group in directions posited by the Health Belief Model. CONCLUSIONS A tailored message intervention changed the beliefs of persons with HF in regard to the benefits and barriers of taking medications, following a sodium-restricted diet, and self-monitoring for signs of fluid overload. Future research is needed to explore the effect of health belief changes on actual self-care behaviors Providing effective discharge instructions, appropriate dose uptitration, education regarding heart failure (HF) monitoring, and strict follow-up have all been shown to decrease readmissions for HF but are all underutilized. The authors developed and evaluated the impact of a quality-improvement HF checklist as a tool to remind physicians to improve the quality of care in HF patients. The checklist was used in randomly selected patients admitted with a primary diagnosis of acute decompensated HF. It included documentation regarding medications and dose uptitration, relevant counseling, and follow-up instructions at discharge. The checklist was used in 48 patients, and this checklist group was compared with a randomly selected control group of 48 patients. Higher proportions of patients were taking angiotensin-converting enzyme (ACE) inhibitors or angiotensin receptor blockers (ARBs) in the checklist group compared with the control group (40 of 48 vs 23 of 48, P<.001). Compared with the controls, dose uptitration of β-blockers and/or ACE inhibitors/ARBs was more common in the checklist group (4 of 48 vs 21 of 48, P<.001). Both 30-day (19% to 6%) and 6-month (42% to 23%) readmissions were lower in the checklist group.
The use of an HF checklist was associated with better quality of care and decreased readmission rates for patients admitted with HF OBJECTIVE To study the effects of a comprehensive discharge planning protocol, designed specifically for the elderly and implemented by nurse specialists, on patient and caregiver outcomes and cost of care. DESIGN Randomized clinical trial. SETTING Hospital of the University of Pennsylvania. PATIENTS 276 patients and 125 caregivers. Patients were 70 years and older and were placed in selected medical and surgical cardiac diagnosis-related groups. MEASUREMENTS Group differences in patient outcomes (length of initial hospital stay, length of time between initial hospital discharge and readmission, and rehospitalization rates) and charges for care (charges for initial hospitalization, rehospitalizations, health services after discharge, and nurse specialist services) were measured 2, 6, and 12 weeks after discharge. RESULTS From the initial hospital discharge to 6 weeks after discharge, patients in the medical intervention group had fewer readmissions, fewer total days rehospitalized, lower readmission charges, and lower charges for health care services after discharge. No differences in these outcomes were found between the surgical intervention and control groups during this period. CONCLUSIONS Study findings support the need for comprehensive discharge planning designed for the elderly and implemented by nurse specialists to improve their outcomes after hospital discharge and to achieve cost savings. The findings also suggest that this intervention had its greatest effect in delaying or preventing rehospitalization of patients in the medical intervention group during the first 6 weeks after discharge BACKGROUND AND OBJECTIVE In COPD, hospital admissions and readmissions account for the majority of health-care costs.
The aim of this prospective randomized controlled study was to determine if early pulmonary rehabilitation, commenced as an inpatient and continued after discharge, reduced acute health-care utilization. METHODS Consecutive COPD patients (n = 397) admitted with an exacerbation were screened: 228 satisfied the eligibility criteria, of whom 97 consented to randomization to rehabilitation or usual care. Both intention-to-treat and per-protocol analyses are reported, with adherence defined a priori as participation in at least 75% of rehabilitation sessions. RESULTS The participants were elderly with severe impairment of pulmonary function, poor health-related quality of life and high COPD-related morbidity. The rehabilitation group demonstrated a 23% (95% CI: 11–36%) risk of readmission at 3 months, with attendees having a 16% (95% CI: 0–32%) risk compared with 32% (95% CI: 19–45%) for usual care. These differences were not significant. There were a total of 79 COPD-related readmission days (1.7 per patient, 95% CI: 0.6–2.7, P = 0.19) in the rehabilitation group, compared with 25 (1.3 per patient, 95% CI: 0–3.1, P = 0.17) for the attendees and 209 (4.2 per patient, 95% CI: 1.7–6.7) for usual care. The BMI, airflow obstruction, dyspnoea and exercise capacity index showed a non-significant trend to greater improvement among attendees compared with those receiving usual care (5.5 (2.3) and 5.6 (2.7) at baseline, improving to 3.7 (1.9) and 4.5 (2.5), respectively, at 3 months). No adverse effects were identified.
CONCLUSIONS Early inpatient-outpatient rehabilitation for COPD patients admitted with an exacerbation was feasible and safe, and was associated with a non-significant trend towards reduced acute health-care utilization We compared two models of assistance (telecardiology versus usual care) for patients discharged after acute coronary syndrome (ACS) in the assessment of angina. Two hundred patients were randomized into two groups at discharge for ACS: Group A to telecardiology and Group B to usual care. Early hospital readmission (in the first month) occurred in 16 patients (seven in Group A and nine in Group B). Six of Group A were readmitted for a cardiac cause (non-cardiac in one). Angina was the only cardiac cause. Five of the Group B patients were readmitted for a cardiac cause (non-cardiac in four). The results of the present study emphasize that patients with ACS suffer from a definite rate of cardiac symptoms within the first month (63%). Angina occurs more frequently within the first two weeks (68% of cases). Telecardiology slightly reduces hospital readmissions (telecardiology 44% versus usual care 56%), but better identifies true angina PRINCIPLES International guidelines for heart failure (HF) care recommend the implementation of inter-professional disease management programmes. To date, no such programme has been tested in Switzerland. The aim of this randomised controlled trial (RCT) was to test the effect on hospitalisation, mortality and quality of life of an adult ambulatory disease management programme for patients with HF in Switzerland. METHODS Consecutive patients admitted to internal medicine in a Swiss university hospital were screened for decompensated HF. A total of 42 eligible patients were randomised to an intervention (n = 22) or usual care group (n = 20). Medical treatment was optimised and lifestyle recommendations were given to all patients.
Intervention patients additionally received a home visit by an HF nurse, followed by 17 telephone calls of decreasing frequency over 12 months, focusing on self-care. Calls from the HF nurse to primary care physicians communicated health concerns and identified goals of care. Data were collected at baseline and at 3, 6, 9 and 12 months. Mixed regression analysis (quality of life) was used. Outcome assessment was conducted by researchers blinded to group assignment. RESULTS After 12 months, 22 (52%) patients had an all-cause re-admission or died. Only 3 patients were hospitalised with HF decompensation. No significant effect of the intervention was found on HF-related quality of life. CONCLUSIONS An inter-professional disease management programme is possible in the Swiss healthcare setting but effects on outcomes need to be confirmed in larger studies Background The aim of discharge planning is to reduce unplanned readmission to the hospital and improve coordination of care. As the global population ages, including family caregivers in discharge planning may be an effective strategy to improve patient outcomes and quality of care. In the United States, hospital readmission following discharge for community-acquired pneumonia is a process measure utilized to evaluate discharge planning outcomes. Objectives The objective of this systematic review was to identify the effect of patient-caregiver dyad discharge learning need interventions on unexpected readmissions within thirty days for elderly patients (65 years or older) with community-acquired pneumonia. Methods A systematic review of English language studies, published and unpublished after 1991, on patient and caregiver learning needs related to discharge education and unplanned hospital readmission. Studies including patients aged 65 and older experiencing discharge from a hospital setting were the primary focus.
A comprehensive search of electronic databases was performed for the period 1991–2010 to find relevant studies. A three-stage search process was used, consisting of an initial database search, scanning of reference lists, citation searching of key papers, contact with authors via the Internet and through personal communication, and keyword searching of the World Wide Web. Four authors independently undertook quality assessment and data analysis using the standardized critical appraisal and data extraction tool from the Joanna Briggs Institute Meta-Analysis of Statistics Assessment and Review Instrument. Data were summarized by use of narrative methods. Main results Five studies were pertinent to the review. Direct comparisons of the studies are limited due to different methods and subject participants. We found statistically significant results that caregiver education interventions have a positive impact on decreasing unplanned readmissions. Two randomized controlled trials and one quasi-experimental study were identified that addressed a multidisciplinary discharge education intervention which included caregivers of elderly pneumonia patients. Multidisciplinary care transition intervention was associated with fewer readmissions in thirty days (odds ratio = 0.55; 95% CI 0.32–0.94; odds ratio = 0.56; 95% CI 0.24–1.31; odds ratio = 0.41; 95% CI 0.28–0.75). One case-control study and one quasi-experimental study identified lack of documented patient or family education as independently associated with unplanned readmissions within thirty days in elderly patient populations that included chronic lung disease (odds ratio = 2.3; 95% CI = 1.2–4.5; odds ratio = 0.52; 95% CI = 0.28–0.96). The first of these studies calculated the rate of readmission when caregiver education is omitted, whilst the second calculated readmission when an educational intervention is included.
However, whilst the improved readmission rates with caregiver education were unlikely to be due to chance, difficulties in isolating caregiver education as a direct intervention and issues of homogeneity limit the generalizability of this review. Conclusions This review highlights the dearth of information on the caregiver's need for specific information at discharge and the need for future research. Implications for practice There is little evidence that education interventions aimed at caregivers are uniformly effective in decreasing pneumonia readmission rates. The evidence suggests that a structured education intervention that includes caregivers probably brings about a small reduction in unexpected readmissions within thirty days for elderly patients with community-acquired pneumonia. The specific caregiver learning needs assessment remains variable and uncertain. Implications for research Research has been aimed at implementation of community-acquired pneumonia guidelines to improve mortality and unexpected readmissions within thirty days. However, quality of discharge teaching, a key component of coordinated care, is a strong predictor of readiness for discharge and readmission rates. There is a need for quality randomized studies evaluating the outcomes specific to patient-caregiver dyad discharge interventions RATIONALE Care coordination has shown inconsistent results as a mechanism to reduce hospital readmission and postdischarge emergency department (ED) visit rates. OBJECTIVE To assess the impact of a supplemental care bundle targeting high-risk elderly inpatients, implemented by hospital-based staff, compared to usual care on a composite outcome of hospital readmission and/or ED visitation at 30 and 60 days following discharge. PATIENTS/METHODS Randomized controlled pilot study in 41 medical inpatients predisposed to unplanned readmission or postdischarge ED visitation, conducted at Baylor University Medical Center.
The intervention group care bundle consisted of medication counseling/reconciliation by a clinical pharmacist (CP), condition-specific education/enhanced discharge planning by a care coordinator (CC), and phone follow-up. RESULTS Groups had similar baseline characteristics. Intervention group readmission/ED visit rates were reduced at 30 days compared to the control group (10.0% versus 38.1%, P = 0.04), but not at 60 days (30.0% versus 42.9%, P = 0.52). For those patients who had a readmission/postdischarge ED visit, the time interval to this event was longer in the intervention group compared to usual care (36.2 versus 15.7 days, P = 0.05). Study power was insufficient to reliably compare the effects of the intervention on lengths of index hospital stay between groups. CONCLUSIONS A targeted care bundle delivered to high-risk elderly inpatients decreased unplanned acute health care utilization up to 30 days following discharge. Dissipation of this effect by 60 days postdischarge defines reasonable expectations for analogous hospital-based educational interventions. Further research is needed regarding the impacts of similar care bundles in larger populations across a variety of inpatient settings OBJECTIVE To assess the efficacy of a multifactorial educational intervention carried out by a pharmacist in patients with heart failure (HF). METHOD A randomized, prospective, open clinical trial in patients admitted for HF. The patients assigned to the intervention group received information about the disease, drug therapy, diet education, and active telephone follow-up. Visits were completed at 2, 6, and 12 months. Hospital re-admissions, days of hospital stay, treatment compliance, satisfaction with the care received, and quality of life (EuroQol) were evaluated; a financial study was conducted in order to assess the possible impact of the program.
The intervention was performed by the pharmacy department in coordination with the cardiology unit. RESULTS 134 patients were included, with a mean age of 75 years and a low educational level. The patients in the intervention group had a higher level of treatment compliance than the patients in the control group. At 12 months of follow-up, 32.9% fewer patients in the intervention group were admitted again vs. the control group. The mean days of hospital stay per patient in the control group were 9.6 (SD=18.5) vs. 5.9 (SD=14.1) in the intervention group. No differences were recorded in quality of life, but the intervention group had a higher score on the satisfaction scale at two months [9.0 (SD=1.3) versus 8.2 (SD=1.8), p=0.026]. Upon adjusting a Cox survival model for the ejection fraction, the patients in the intervention group had a lower risk of re-admission (hazard ratio 0.56; 95% CI: 0.32–0.97). The financial analysis evidenced savings in hospital costs of €578 per patient in favor of the intervention group. CONCLUSIONS Post-discharge pharmaceutical care allows for reducing the number of new admissions in patients with heart failure and the total days of hospital stay, and improves treatment compliance without increasing the costs of care BACKGROUND Disease management is effective in the general population, but it has not been tested prospectively in a sample of solely Hispanics with heart failure (HF). We tested the effectiveness of telephone case management in decreasing hospitalizations and improving health-related quality of life (HRQL) and depression in Hispanics of Mexican origin with HF. METHODS AND RESULTS Hospitalized Hispanics with chronic HF (n = 134) were enrolled and randomized to intervention (n = 69) or usual care (n = 65). The sample was elderly (72±11 years), New York Heart Association class III/IV (81.3%), and poorly educated (78.4% less than high school education).
Most (55%) were unacculturated into US society. Bilingual/bicultural Mexican-American registered nurses provided 6 months of standardized telephone case management. Data on hospitalizations were collected from automated systems at 1, 3, and 6 months after the index hospital discharge. Health-related quality of life and depression were measured by self-report at enrollment, 3, and 6 months. Intention-to-treat analysis was used. No significant group differences were found in HF hospitalizations, the primary outcome variable (usual care: 0.49±0.81 [CI 0.25–0.73]; intervention: 0.55±1.1 [CI 0.32–0.78] at 6 months). No significant group differences were found in HF readmission rate, HF days in the hospital, HF cost of care, all-cause hospitalizations or cost, mortality, HRQL, or depression. CONCLUSION These results have important implications because of the current widespread enthusiasm for disease management. Although disease management is effective in the mainstream HF patient population, in Hispanics this ill, elderly, and poorly educated, a different approach may be needed ABSTRACT BACKGROUND Patient care transitions are periods of enhanced risk. Discharge summaries have been used to communicate essential information between hospital-based physicians and primary care physicians (PCPs), and may reduce rates of adverse events after discharge. OBJECTIVE To assess PCP satisfaction with an electronic discharge summary (EDS) program as compared to conventional dictated discharge summaries. DESIGN Cluster randomized trial. PARTICIPANTS Four medical teams of an academic general medical service. MEASUREMENTS The primary endpoint was overall discharge summary quality, as assessed by PCPs using a 100-point visual analogue scale.
Other endpoints included housestaff satisfaction (using a 100-point scale), adverse outcomes after discharge (combined endpoint of emergency department visits, readmission, and death), and patient understanding of discharge details as measured by the Care Transition Model (CTM-3) score (ranging from 0 to 100). RESULTS 209 patient discharges were included over a 2-month period encompassing 1 housestaff rotation. Surveys were sent out for 188 of these patient discharges, and 119 were returned (63% response rate). No difference in PCP-reported overall quality was observed between the 2 methods (86.4 for EDS vs. 84.3 for dictation; P = 0.53). Housestaff found the EDS significantly easier to use than conventional dictation (86.5 for EDS vs. 49.2 for dictation; P = 0.03), but there was no difference in overall housestaff satisfaction. There was no difference between discharge methods for the combined endpoint for adverse outcomes (22 for EDS [21%] vs. 21 for dictation [20%]; P = 0.89), or for patient understanding of discharge details (CTM-3 score 80.3 for EDS vs. 81.3 for dictation; P = 0.81). CONCLUSION An EDS program can be used by housestaff to more easily create hospital discharge summaries, and there was no difference in PCP satisfaction Objective: To determine the feasibility and potential impact of a non-pharmacologic multidisciplinary intervention for reducing hospital readmissions in elderly patients with congestive heart failure. Design: Prospective, randomized clinical trial, with 2:1 assignment to the study intervention or usual care. Setting: 550-bed secondary and tertiary care university teaching hospital. Patients and participants: 98 patients ≥70 years of age (mean 79±6 years) admitted with documented congestive heart failure.
Interventions: Comprehensive multidisciplinary treatment strategy consisting of intensive teaching by a geriatric cardiac nurse, a detailed review of medications by a geriatric cardiologist with specific recommendations designed to improve medication compliance and reduce side effects, early consultation with social services to facilitate discharge planning, dietary teaching by a hospital dietician, and close follow-up after discharge by home care and the study team. Measurements and main results: All patients were followed for 90 days after initial hospital discharge. The primary study endpoints were rehospitalization within the 90-day interval and the cumulative number of days hospitalized during follow-up. The 90-day readmission rate was 33.3% (21.7%–44.9%) for the patients receiving the study intervention (n=63) compared with 45.7% (29.2%–62.2%) for the control patients (n=35). The mean number of days hospitalized was 4.3±1.1 (2.1–6.5) for the treated patients vs. 5.7±2.0 (1.8–9.6) for the usual-care patients. In a prospectively defined subgroup of patients at intermediate risk for readmission (n=61), readmissions were reduced by 42.2% (from 47.6% to 27.5%; p=0.10), and the average number of hospital days during follow-up decreased from 6.7±3.2 days to 3.2±1.2 days (p = NS). Conclusions: These pilot data suggest that a comprehensive, multidisciplinary approach to reducing repetitive hospitalizations in elderly patients with congestive heart failure may lead to a reduction in readmissions and hospital days, particularly in patients at moderate risk for early rehospitalization. Further evaluation of this treatment strategy, including an assessment of the cost-effectiveness, is warranted Background: Medical inpatients are at risk for suboptimal health outcomes from adverse drug events and under-use of evidence-based therapies.
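The subgroup result above reports readmissions "reduced by 42.2% (from 47.6% to 27.5%)", i.e. a relative risk reduction derived from the two group rates. A minimal sketch of that arithmetic, using only the two percentages quoted in the abstract (the NNT is a derived illustration, not a figure from the study):

```python
# Minimal sketch: absolute risk reduction (ARR), relative risk reduction (RRR),
# and number needed to treat (NNT) from two event rates.
# Rates taken from the subgroup above: 47.6% usual care vs. 27.5% intervention.
def risk_reduction(control_rate, treatment_rate):
    arr = control_rate - treatment_rate  # absolute risk reduction
    rrr = arr / control_rate             # relative risk reduction
    nnt = 1 / arr                        # number needed to treat (illustrative)
    return arr, rrr, nnt

arr, rrr, nnt = risk_reduction(0.476, 0.275)
print(f"ARR={arr:.1%}, RRR={rrr:.1%}, NNT={nnt:.1f}")
# prints: ARR=20.1%, RRR=42.2%, NNT=5.0
```

The 42.2% in the abstract is the relative, not absolute, reduction; the absolute difference between arms is 20.1 percentage points.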
We sought to determine whether collaborative care including a team-based clinical pharmacist improves the quality of prescribed drug therapy and reduces hospital readmission. Methods: Multicenter, quasi-randomized, controlled clinical trial. Consecutive patients admitted to 2 internal and 2 family medicine teams in 3 teaching hospitals between January 30, 2006 and February 2, 2007 were included. Team care patients received proactive clinical pharmacist services (medication history, patient-care round participation, resolution of drug-related issues, and discharge counseling). Usual care patients received traditional reactive clinical pharmacist services. The primary outcome was the overall quality score, measured retrospectively by a blinded chart reviewer using 20 indicators targeting 5 conditions. Secondary outcomes included 3- and 6-month readmission. Results: A total of 452 patients (220 team care, 231 usual care; mean age: 74 years; 46% male) met eligibility criteria. Team care patients were more likely than usual care patients to receive care specified by the indicators overall (56.4% vs. 45.3%; adjusted mean difference: 10.4%; 95% confidence interval [CI]: 4.9%, 15.7%) and for each targeted disease state except for heart failure. Team care patients experienced fewer readmissions at 3 months (36.2% vs. 45.5%; adjusted OR: 0.63; 95% CI: 0.42, 0.94) but not at 6 months (50.7% vs. 56.3%; adjusted OR: 0.78; 95% CI: 0.53, 1.15).
Conclusions: In patients admitted to internal and family medicine teams, team-based care including a clinical pharmacist improved the overall quality of medication use and reduced rates of readmission OBJECTIVES to test the hypothesis as to whether persons newly discharged into the community following an acute stroke and assigned a stroke case manager would experience, compared to usual post-hospital care, better health-related quality of life (HRQL), fewer emergency room visits and fewer non-elective hospitalisations. DESIGN a stratified, balanced, evaluator-blinded, randomised clinical trial. SETTING five university-affiliated acute-care hospitals in Montreal, Quebec, Canada. PARTICIPANTS persons (n = 190) returning home directly from the acute-care hospital following a first or recurrent stroke, with a need for health care supervision post-discharge because of low function, co-morbidity, or isolation. INTERVENTION for 6 weeks following hospital discharge, a nurse stroke care manager maintained contact with patients through home visits and telephone calls designed to coordinate care with the person's personal physician and link the stroke survivor into community-based stroke services. MEASUREMENTS the primary outcome was the Physical Component Summary (PCS) of the Short-Form (SF)-36 survey. A secondary outcome was utilisation of health services. Also measured was the impact of stroke on functioning. Measurements were made at hospital discharge (baseline), following the 6-week intervention and at 6 months post-stroke. RESULTS the average age of the participants was 70 years. Discharge was achieved on average 12 days post-stroke and most participants had had a stroke of moderate severity. There were no differences between groups on the primary outcome measure, health services utilisation, or any of the secondary outcome measures.
CONCLUSION for this population, there was no evidence that this type of passive case management conferred any added benefit, in terms of improvement in health-related quality of life or reduction in health services utilisation and stroke impact, over usual post-discharge management With growing awareness of medical fallibility, researchers need to develop tools to identify and study medical mistakes. We examined the utility of hospital readmissions for this purpose in a prospective case-control study in a large academic medical center in Israel. All patients with nonelective readmissions to 2 departments of medicine within 30 days of discharge were interviewed, and their medical records were carefully examined with emphasis on the index admission. Patient data were compared to data for age- and sex-matched controls (n = 140) who were not readmitted. Medical records of readmitted and control patients were blindly evaluated by 2 senior clinicians who independently identified potential quality of care (QOC) problems during the index admission. Inhospital and late mortality was determined 6 months after discharge. Over a period of 3 months there were 1988 urgent admissions; 1913 discharges and subsequently 271 unplanned readmissions occurred (14.1% of discharges). Readmissions occurred an average of 10 days after discharge, and readmitted patients were sicker than controls (mean, 4.3 vs. 3.3 diagnoses per patient), although their length of stay was similarly short (3.4 ± 2.8 d). Analysis of all readmissions revealed QOC problems in 90/271 (33%) of readmissions, 4.5% of hospitalizations. All were deemed preventable. Interobserver agreement was good (83%, kappa = 0.67). Among matched controls, only 8/140 admissions revealed QOC problems (6%, p < 0.001) (kappa = 0.77).
The preventable readmissions mostly involved a vascular event or congestive heart failure; they occurred within a mean of 10 ± 8 days of the index admission, and their inpatient mortality was 6.7% vs. 1.7% among readmissions that had no QOC problems (odds ratio, 4.1; 95% confidence interval, 1.0-16.7). The main pitfalls identified during the index admission included incomplete workup (33%), too short a hospital stay (31%), inappropriate medication (44%), diagnostic error (16%), and disregarding a significant laboratory result (12%). In many patients more than 1 pitfall was identified (mean, 1.5 per patient). Risk factors for preventable readmission included older age and living in an institution (p < 0.05). Almost two thirds of the readmitted patients with QOC problems were discharged after spending 2 days or fewer at the hospital. In conclusion, unplanned readmissions within 30 days of discharge are frequent, more prevalent in sicker patients, and possibly associated with increased mortality. In a third of readmitted patients a QOC problem can be identified, and these problems are preventable. Thus, readmission may be used as a screening tool for potential QOC problems in the department of medicine. Routine monitoring of all readmissions may provide a simple, cost-effective means of identifying and addressing medical mistakes. Abbreviations: 95% CI = 95% confidence interval, LOS = length of stay, NS = not significant, OR = odds ratio, QOC = quality of care.

The purpose of this randomized clinical trial was to examine the effects of a comprehensive discharge planning protocol implemented by a gerontological nurse specialist as compared with the hospital's general discharge planning procedure. There were no statistically significant differences between groups in length of initial patient hospitalization or in rates of posthospital infections.
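The odds ratio with a Wald confidence interval, as reported for inpatient mortality above, comes directly from the cell counts of a 2x2 table. A minimal sketch follows; note that the individual death counts are not given in the abstract, so the example counts (6 of 90 QOC readmissions, 3 of 181 others) are back-calculated approximations from the reported 6.7% and 1.7% and are illustrative only:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for the 2x2 table:
    a = exposed events, b = exposed non-events,
    c = unexposed events, d = unexposed non-events."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Approximate counts inferred from the reported percentages (assumption,
# not taken from the study): ~6/90 deaths with QOC problems, ~3/181 without.
or_, lo, hi = odds_ratio_ci(6, 84, 3, 178)
```

With these assumed counts the function yields an odds ratio close to the reported 4.1, with a similarly wide interval reflecting the small number of deaths.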
A statistically significant difference was found when groups were compared on the number of subjects rehospitalized during the study period. The findings of this pilot study reinforce the need for continued study of the impact of comprehensive discharge planning for hospitalized elderly patients.

BACKGROUND: Both randomized and nonrandomized controlled studies have linked congestive heart failure (CHF) case management (CM) to decreased readmissions and improved outcomes in mostly homogeneous settings. The objective of this randomized controlled trial was to test the effect of CHF CM on the 90-day readmission rate in a more heterogeneous setting. METHODS: A total of 287 patients admitted to the hospital with a primary or secondary diagnosis of CHF, left ventricular dysfunction of less than 40%, or radiologic evidence of pulmonary edema for which they underwent diuresis were randomized. The intervention consisted of 4 major components: early discharge planning, patient and family CHF education, 12 weeks of telephone follow-up, and promotion of optimal CHF medications. RESULTS: The 90-day readmission rates were equal for the CM and usual care groups (37%). Total inpatient and outpatient median costs and readmission median costs were reduced by 14% and 26%, respectively, for the intervention group. Patients in the CM group were more likely to be taking CHF medication at target doses, but dosages did not increase significantly throughout the 12 weeks. Although both groups took their medications as prescribed equally well, the rest of the adherence to the treatment plan was significantly better in the CM group. Subgroup analysis of patients who lived locally and saw a cardiologist showed a significant decrease in CHF readmissions for the intervention group (P = .03). CONCLUSIONS: These results suggest several limitations to the generalizability of the CHF CM-improved outcome link in a heterogeneous setting.
One explanation is that the lack of coordinated system supports and varied accessibility to care in an extended, non-networked physician setting limits the effectiveness of the CM.

OBJECTIVES: To examine the effectiveness of a transitional care intervention delivered by advanced practice nurses (APNs) to elders hospitalized with heart failure. DESIGN: Randomized, controlled trial with follow-up through 52 weeks after index hospital discharge. SETTING: Six Philadelphia academic and community hospitals. PARTICIPANTS: Two hundred thirty-nine eligible patients aged 65 and older, hospitalized with heart failure. INTERVENTION: A 3-month APN-directed discharge planning and home follow-up protocol. MEASUREMENTS: Time to first rehospitalization or death, number of rehospitalizations, quality of life, functional status, costs, and satisfaction with care. RESULTS: Mean age of patients (control n = 121; intervention n = 118) enrolled was 76; 43% were male, and 36% were African American. Time to first readmission or death was longer in intervention patients (log-rank χ² = 5.0, P = .026; Cox regression incidence density ratio = 1.65, 95% confidence interval = 1.13-2.40). At 52 weeks, intervention group patients had fewer readmissions (104 vs 162, P = .047) and lower mean total costs ($7,636 vs $12,481, P = .002). For intervention patients, only short-term improvements were demonstrated in overall quality of life (12 weeks, P < .05), the physical dimension of quality of life (2 weeks, P < .01; 12 weeks, P < .05) and patient satisfaction (assessed at 2 and 6 weeks, P < .001).
CONCLUSION: A comprehensive transitional care intervention for elders hospitalized with heart failure increased the length of time between hospital discharge and readmission or death, reduced the total number of rehospitalizations, and decreased healthcare costs, thus demonstrating great promise for improving clinical and economic outcomes.

BACKGROUND: Patients are routinely ill prepared for the transition from hospital to home. Inadequate communication between hospitalists and primary care providers can further compromise post-discharge care. Redesigning the discharge process may improve the continuity and the quality of patient care. OBJECTIVES: To evaluate a low-cost intervention designed to promptly reconnect patients to their "medical home" after hospital discharge. DESIGN: Randomized controlled study. Intervention patients received a "user-friendly" Patient Discharge Form and, upon arrival at home, telephone outreach from a nurse at their primary care site. PARTICIPANTS: A culturally and linguistically diverse group of patients admitted to a small community teaching hospital. MEASUREMENTS: Four undesirable outcomes were measured after hospital discharge: (1) no outpatient follow-up within 21 days; (2) readmission within 31 days; (3) emergency department visit within 31 days; and (4) failure by the primary care provider to complete an outpatient workup recommended by the hospital doctors. Outcomes of the intervention group were compared with concurrent and historical controls. RESULTS: Only 25.5% of intervention patients had 1 or more undesirable outcomes, compared with 55.1% of the concurrent and 55.0% of the historical controls. Notably, only 14.9% of the intervention patients failed to follow up within 21 days, compared with 40.8% of the concurrent and 35.0% of the historical controls.
Only 11.5% of recommended outpatient workups in the intervention group were incomplete, versus 31.3% in the concurrent and 31.0% in the historical controls. CONCLUSIONS: A low-cost discharge-transfer intervention may improve the rates of outpatient follow-up and of completed workups after hospital discharge.

BACKGROUND: Congestive heart failure is the most common indication for admission to the hospital among older adults. Behavioral factors, such as poor compliance with treatment, frequently contribute to exacerbations of heart failure, a fact suggesting that many admissions could be prevented. METHODS: We conducted a prospective, randomized trial of the effect of a nurse-directed, multidisciplinary intervention on rates of readmission within 90 days of hospital discharge, quality of life, and costs of care for high-risk patients 70 years of age or older who were hospitalized with congestive heart failure. The intervention consisted of comprehensive education of the patient and family, a prescribed diet, social-service consultation and planning for an early discharge, a review of medications, and intensive follow-up. RESULTS: Survival for 90 days without readmission, the primary outcome measure, was achieved in 91 of the 142 patients in the treatment group, as compared with 75 of the 140 patients in the control group, who received conventional care (P = 0.09). There were 94 readmissions in the control group and 53 in the treatment group (risk ratio, 0.56; P = 0.02). The number of readmissions for heart failure was reduced by 56.2 percent in the treatment group (54 vs. 24, P = 0.04), whereas the number of readmissions for other causes was reduced by 28.5 percent (40 vs. 29, P not significant). In the control group, 23 patients (16.4 percent) had more than one readmission, as compared with 9 patients (6.3 percent) in the treatment group (risk ratio, 0.39; P = 0.01).
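The risk ratios reported in this trial (0.56 for all readmissions, 0.39 for multiple readmissions) can be reproduced from the counts given in the abstract (53/142 vs. 94/140 readmissions; 9/142 vs. 23/140 patients with more than one readmission). A minimal sketch of that arithmetic:

```python
def rate_ratio(events_tx, n_tx, events_ctrl, n_ctrl):
    """Ratio of per-patient event rates, treatment vs. control."""
    return (events_tx / n_tx) / (events_ctrl / n_ctrl)

# Counts taken from the abstract: 142 treatment and 140 control patients.
rr_readmit = rate_ratio(53, 142, 94, 140)  # all readmissions
rr_multi = rate_ratio(9, 142, 23, 140)     # patients with >1 readmission
```

Both values round to the ratios quoted in the text, which confirms that the reported "risk ratios" are per-patient event-rate ratios rather than hazard ratios.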
In a subgroup of 126 patients, quality-of-life scores at 90 days improved more from baseline for patients in the treatment group (P = 0.001). Because of the reduction in hospital admissions, the overall cost of care was $460 less per patient in the treatment group. CONCLUSIONS: A nurse-directed, multidisciplinary intervention can improve quality of life and reduce hospital use and medical costs for elderly patients with congestive heart failure.

AIM: The aim of this trial was to prospectively evaluate the effect of follow-up at a nurse-led heart failure clinic on mortality, morbidity and self-care behaviour for patients hospitalised due to heart failure, for 12 months after discharge. METHODS: A total of 106 patients were randomly assigned to either follow-up at a nurse-led heart failure clinic or usual care. The nurse-led heart failure clinic was staffed by specially educated and experienced cardiac nurses delegated the responsibility for making protocol-led changes in medications. The first follow-up visit was 2-3 weeks after discharge. During the visit the nurse evaluated the heart failure status and the treatment, and gave education about heart failure and social support to the patient and his family. RESULTS: There were fewer patients with events (death or admission) after 12 months in the intervention group compared with the control group (29 vs 40, p = 0.03) and fewer deaths after 12 months (7 vs 20, p = 0.005). The intervention group had fewer admissions (33 vs 56, p = 0.047) and days in hospital (350 vs 592, p = 0.045) during the first 3 months. After 12 months the intervention was associated with a 55% decrease in admissions/patient/month (0.18 vs 0.40, p = 0.06) and fewer days in hospital/patient/month (1.4 vs 3.9, p = 0.02). The intervention group had significantly higher self-care scores at 3 and 12 months compared with the control group (p = 0.02 and p = 0.01).
CONCLUSIONS: Follow-up after hospitalisation at a nurse-led heart failure clinic can improve survival and self-care behaviour in patients with heart failure, as well as reduce the number of events, readmissions and days in hospital.

Context: Emergency department visits and rehospitalizations are common after hospital discharge. Contribution: This trial demonstrated that a nurse discharge advocate and clinical pharmacist working together to coordinate hospital discharge, educate patients, and reconcile medications led to fewer follow-up emergency visits and rehospitalizations than usual care alone. Caution: The trial was conducted at a single center, and not all eligible patients were enrolled. Implication: A systematic approach to hospital discharges can reduce unnecessary health service use. (The Editors)

One in 5 hospitalizations is complicated by postdischarge adverse events (1, 2), some of which may lead to preventable emergency department visits or readmissions. Despite this finding, hospital discharge procedures have not been standardized (3). In addition, the declining presence of primary care providers (PCPs) in hospitals has not been adequately accompanied by systems to ensure that patient data are transferred to subsequent caregivers (4, 5). For example, discharge summaries frequently lack critical data and are not sent to the PCP in a timely fashion (6, 7), resulting in outpatient clinicians being unaware of test results that were pending at discharge (8) and evaluations that were scheduled to be done after discharge not being completed (9). Similarly, patients are often left unprepared at discharge; many do not understand their discharge medications and cannot recall their chief diagnoses (10). With more than 32 million adult discharges in the United States each year (11), these deficiencies in the transition of care increase illness, unnecessary hospital utilization, and cost.
Some peridischarge interventions have shown a reduction in hospital readmission rates and cost (12-14), emergency department visits (15), and postdischarge adverse events (16), whereas some have shown little or no effect (17-20). Peridischarge interventions have also shown improved PCP follow-up and outpatient work-ups (21) and higher patient satisfaction (15). Most of these studies have focused on specific diagnoses (14, 22, 23) or highly selected populations, such as geriatric adults (12, 13, 19, 24). Some have focused on specific aspects of the discharge, such as increasing access to primary care follow-up (25), connecting with transitional nursing services (26), or improving patients' ability to advocate for themselves after discharge (12). To date, no study has evaluated a standardized discharge intervention that includes patient education, comprehensive discharge planning, and postdischarge telephone reinforcement in a general medical population. In 2004, we began an in-depth examination of hospital discharge, for which we designed a package of services to minimize discharge failures, a process called reengineered discharge (RED) (Table 1) (3, 27). We did a randomized, controlled trial to evaluate the clinical effect of implementing RED among patients admitted to a general medical service. Table 1. Components of Reengineered Hospital Discharge. Methods. Setting and Participants. We conducted a 2-group, randomized, controlled trial of English-speaking patients 18 years of age or older who were admitted to the medical teaching service of Boston Medical Center, Boston, Massachusetts, a large, urban, safety-net hospital with an ethnically diverse patient population. Patients had to have a telephone, be able to comprehend study details and the consent process in English, and have plans to be discharged to a U.S. community.
We did not enroll patients if they were admitted from a skilled nursing facility or other hospital, transferred to a different hospital service before enrollment, admitted for a planned hospitalization, on hospital precautions or suicide watch, or deaf or blind. Boston University's institutional review board approved all study activities. Randomization. Each morning, a list of admitted patients was reviewed for initial eligibility (hospital location, age, date and time of admission, and previous enrollment). Last names of potential participants were ranked by using a random-number sequence to determine the order in which to approach patients for enrollment. A trained research assistant then approached each patient and further determined eligibility according to inclusion and exclusion criteria (Figure 1). Figure 1. Study flow diagram. *Patients did not meet inclusion criteria if they were admitted from or planned discharge to an institutional setting (n = 74), planned hospitalization (n = 3) or discharge to a non-U.S. community (n = 5), were transferred to a different hospital service (n = 8), did not speak English (n = 371) or have a telephone (n = 71), were on hospital precautions (n = 274) or suicide watch with a sitter (n = 10), were unable to consent (n = 181), had sickle cell disease as the admitting diagnosis (n = 38), had privacy status (n = 8), were deaf or blind (n = 2), or other (n = 4). Usual care participants did not meet eligibility criteria if they were discharged to a nursing facility (n = 28), were transferred to another hospital service (n = 1), were previously enrolled (n = 1), died during the index admission (n = 2), requested to be removed (n = 5), or other (n = 3).
Intervention participants did not meet eligibility criteria if they were discharged to a nursing facility (n = 21), were transferred to another hospital service (n = 6), died during the index admission (n = 1), requested to be removed (n = 2), or other (n = 8). 107 intervention participants did not receive a reinforcement call because they could not be reached by telephone (n = 93), they were readmitted the same or next day (n = 2), there was no staffing coverage (n = 8), or other (n = 4). By using block randomization (28) with varying block sizes of 6 and 8, we randomly arranged index cards indicating either the usual care or intervention group. We placed the cards in opaque envelopes labeled consecutively with study numbers. We assigned eligible participants who consented to enrollment to a study group by revealing the concealed index card. This process continued until 2 participants were enrolled each day of the week (or 3 participants if the first 2 participants were randomly assigned to the usual care group). This protocol ensured that research assistants could not selectively choose potential participants for enrollment or predict assignment. Participants randomly assigned to usual care received no further intervention. There were 40 participants in the usual care group and 38 in the intervention group who were enrolled but no longer met inclusion criteria at discharge (most commonly because they were discharged to a nursing facility). Because the primary analysis was by intention to treat, we included these participants in the analysis, with the exception of those who died before index discharge, requested to be removed, or were previously enrolled (Figure 1). Interventions. Nurse discharge advocates (DAs) carried out all aspects of the in-hospital intervention. We hired 6 part-time DAs to work with intervention participants to ensure coverage by 1 DA 7 days a week, 5 hours a day.
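The permuted-block allocation described above (varying block sizes of 6 and 8, arranged by a random-number sequence) can be sketched in code. This is a generic illustration of the technique, not the study's actual allocation procedure, which used physical index cards in sealed envelopes:

```python
import random

def block_randomize(n, block_sizes=(6, 8), arms=("usual care", "intervention"), seed=0):
    """Permuted-block allocation: each block holds equal numbers of each
    arm in shuffled order, keeping group sizes nearly balanced while
    making the next assignment unpredictable."""
    rng = random.Random(seed)
    allocation = []
    while len(allocation) < n:
        size = rng.choice(block_sizes)           # pick a block size at random
        block = [arms[i % len(arms)] for i in range(size)]  # equal split
        rng.shuffle(block)                       # shuffle within the block
        allocation.extend(block)
    return allocation[:n]

seq = block_randomize(100)
```

Because every complete block is balanced, the two arms can never differ by more than half the largest block size at any point in the sequence.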
We trained all DAs to deliver the RED intervention by using a manual containing detailed scripts, observation of relevant clinical interactions, and simulated practice sessions. The primary goals of the DA were to coordinate the discharge plan with the hospital team and to educate and prepare the participant for discharge. At admission, the DA completed the RED intervention components outlined in Table 1. Additional information about the DA training manual is published elsewhere (3) and can be found on our Web site (www.bu.edu/fammed/projectred/index.html). With information collected from the hospital team and the participant, the DA created the after-hospital care plan (AHCP), which contained medical provider contact information, dates for appointments and tests, an appointment calendar, a color-coded medication schedule, a list of tests with pending results at discharge, an illustrated description of the discharge diagnosis, and information about what to do if a problem arises. Information for the AHCP was manually entered into a Microsoft Word (Microsoft, Redmond, Washington) template, printed, and spiral-bound to produce an individualized, color booklet designed to be accessible to individuals with limited health literacy. By using scripts from the training manual, the DA used a teach-back methodology (29) to review the contents of the AHCP with the participant. On the day of discharge, the AHCP and discharge summary were faxed to the PCP. A clinical pharmacist telephoned the participants 2 to 4 days after the index discharge to reinforce the discharge plan by using a scripted interview. The pharmacist had access to the AHCP and hospital discharge summary and, over several days, made at least 3 attempts to reach each participant. The pharmacist asked participants to bring their medications to the telephone to review them and address medication-related problems; the pharmacist communicated these issues to the PCP or DA.
Outcome Measures and Follow-up. At the time of recruitment, research assistants collected baseline data, including sociodemographic characteristics; the Short Form-12 Health Survey, Version 2 (30); the depression subscale from the Patient Health Questionnaire-9 (31); and the Rapid Estimate of Adult Literacy in Medicine (32). We calculated the Charlson Comorbidity Index score by using primary and secondary diagnoses recorded on the index admission discharge summary (33). We determined the number of hospital admissions and emergency department visits in the 6 months before the index admission through medical record review (Boston Medical Center hospital utilization) and participant report (all other hospital utilization). The primary end point was the rate of hospital utilization, the total number of emergency department visits and readmissions per participant within 30 days of the index discharge.

In this randomized controlled trial we tested the efficacy of an intervention program (CARE: Creating Avenues for Relative Empowerment) for improving outcomes of hospitalized older adults and their family caregivers (FCGs). FCG-patient dyads (n = 407) were randomized into two groups. The CARE group received a two-session empowerment-educational program 1-2 days post-admission and 1-3 days pre-discharge. The attention control group received a generic information program during the same timeframe. Follow-up was at 2 weeks and 2 months post-discharge. There were no statistically significant differences in patient or FCG outcomes. However, inconsistent evidence of role outcome differences suggests that CARE may benefit certain FCG subgroups instead of being a one-size-fits-all intervention strategy. Closer examination of CARE's mechanisms and effects is needed.

OBJECTIVE: To investigate the business case of postdischarge care transition (PDCT) among Medicare beneficiaries by conducting a cost-benefit analysis.
DESIGN: Randomized controlled trial. SETTING: A general hospital in upstate New York State. PARTICIPANTS: Elderly Medicare beneficiaries being treated from October 2008 through December 2009 were randomly selected to receive services as part of a comprehensive PDCT program (intervention: 173 patients) or the regular discharge process (control: 160 patients) and followed for 12 months. INTERVENTION: The intervention comprised five activities: development of a patient-centered health record, a structured discharge preparation checklist of critical activities, delivery of patient self-activation and management sessions, follow-up appointments, and coordination of data flow. MEASUREMENTS: Cost-benefit ratio of the PDCT program; self-management skills and abilities. RESULTS: The 1-year readmission analysis revealed that control participants were more likely to be readmitted than intervention participants (58.2% vs 48.2%; P = .08), with most of that difference observed in the 91 to 365 days after discharge. Findings from the cost-benefit analysis revealed a cost-benefit ratio of 1.09, which indicates that for every $1 spent on the program, a saving of $1.09 was realized. In addition, participating in a care transition program significantly enhanced self-management skills and abilities. CONCLUSION: Postdischarge care transition programs have a dual benefit of enhancing elderly adults' self-management skills and abilities and producing cost savings. This study builds a case for the inclusion of PDCT programs as a reimbursable service in benefit packages.

Patients with chronic conditions are heavy users of the health care system. There are opportunities for significant savings and improvements to patient care if patients can be maintained in their homes. A randomized controlled trial tested the impact of 3 months of telehome monitoring on hospital readmission, quality of life, and functional status in patients with heart failure or angina.
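The cost-benefit ratio of 1.09 quoted above is simply program savings divided by program cost. A minimal sketch of that arithmetic; the dollar figures below are hypothetical, chosen only to illustrate the reported ratio (the abstract does not give the underlying totals):

```python
def cost_benefit_ratio(total_savings, total_cost):
    """Savings realized per dollar spent on the program; a ratio above 1
    means the program more than pays for itself."""
    return total_savings / total_cost

# Hypothetical totals (assumption): $109,000 in savings for $100,000 spent.
ratio = cost_benefit_ratio(109_000, 100_000)
```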
The intervention consisted of videoconferencing and phone-line transmission of weight, blood pressure, and electrocardiograms. Telehome monitoring significantly reduced the number of hospital readmissions and days spent in the hospital for patients with angina, and improved quality of life and functional status in patients with heart failure or angina. Patients found the technology easy to use and expressed high levels of satisfaction. Telehealth technologies are a viable means of providing home monitoring to patients with heart disease at high risk of hospital readmission to improve their self-care abilities.

OBJECTIVES: To assess the effect of an electronic health record-based transitional care intervention involving automated alerts to primary care providers and staff when older adults were discharged from the hospital. DESIGN: Randomized controlled trial. SETTING: Large multispecialty group practice. PARTICIPANTS: Individuals aged 65 and older discharged from hospital to home. INTERVENTION: In addition to notifying primary care providers about the individual's recent discharge, the system provided information about new drugs added during the inpatient stay, warnings about drug-drug interactions, recommendations for dose changes and laboratory monitoring of high-risk medications, and alerts to the primary care provider's support staff to schedule a posthospitalization office visit. MEASUREMENTS: An outpatient office visit with a primary care provider after discharge, and rehospitalization within 30 days after discharge. RESULTS: Of the 1,870 discharges in the intervention group, 27.7% had an office visit with a primary care provider within 7 days of discharge. Of the 1,791 discharges in the control group, 28.3% had an office visit with a primary care provider within 7 days of discharge. In the intervention group, 18.8% experienced a rehospitalization within the 30-day period after discharge, compared with 19.9% in the control group.
The hazard ratio for an office visit with a primary care physician did not differ significantly between the intervention and control groups. The hazard ratio for rehospitalization in the 30-day period after hospital discharge in the intervention versus the control group was 0.94 (95% confidence interval = 0.81-1.1). CONCLUSION: This electronic health record-based intervention did not have a significant effect on the timeliness of office visits to primary care providers after hospitalization or on the risk of rehospitalization.

Hospital readmission is an indicator of care quality. Studies have been conducted to test whether post-discharge transitional care programs can reduce hospital readmission, but results are not conclusive. The contemporary development of post-discharge support advocates a health and social partnership approach. There is a paucity of experimental studies examining the effects of such efforts. This study designed a health-social transitional care management program (HSTCMP) and subjected it to empirical testing using a randomized controlled trial in the medical units of an acute general hospital with 1700 beds in Hong Kong during the period of February 2009 to July 2010. Results using per-protocol analysis revealed that the HSTCMP significantly reduced readmission at 4 weeks (study 4.0%, control 10.2%, χ² = 7.98, p = 0.005). The intention-to-treat result also showed a lower readmission rate in the study group, but the result was not significant (study 11.5%, control 14.7%, χ² = 1.53, p = 0.258). There was, however, significant improvement in quality of life, self-efficacy and satisfaction in the study group in both per-protocol and intention-to-treat analyses.
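The χ² statistics quoted for the readmission comparisons above come from Pearson's chi-square test on a 2x2 table (readmitted or not, by group). A minimal sketch of the statistic, without continuity correction; the table used in the check below is a balanced hypothetical example, since the abstract does not report the per-group denominators needed to reproduce the exact values:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for the
    2x2 contingency table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    expected = [row1 * col1 / n, row1 * col2 / n,
                row2 * col1 / n, row2 * col2 / n]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Sanity check on a hypothetical table: all expected cells equal 15.
stat = chi_square_2x2(10, 20, 20, 10)
```

The statistic is compared against a chi-square distribution with 1 degree of freedom to obtain the p-values reported in the abstract.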
The study suggests that a health-social partnership, using volunteers as substitutes for some of the professional care, may be effective for general medical patients.

OBJECTIVE: To ascertain the effectiveness of clinical pathways for improving patient outcomes and decreasing lengths of stay after hip and knee arthroplasty. DESIGN AND SETTING: Twelve-month randomised prospective trial comparing patients treated through a clinical pathway with those treated by an established standard of care at a single tertiary referral university hospital. PARTICIPANTS: 163 patients (56 men and 107 women; mean age, 66 years) undergoing primary hip or knee arthroplasty, randomly allocated to the clinical pathway (92 patients) or the control group (71 patients). MAIN OUTCOME MEASURES: Time to sitting out of bed and walking; rates of complications and readmissions; match to planned discharge destination; and length of hospital stay. RESULTS: Clinical pathway patients had a shorter mean length of stay (P = 0.011), earlier ambulation (P = 0.001), a lower readmission rate (P = 0.06) and closer matching of discharge destination. There were beneficial effects of attending patient seminars and preadmission clinics for both pathway and control patients. CONCLUSION: A clinical pathway is an effective method of improving patient outcomes and decreasing length of stay following hip and knee arthroplasty.

AIMS: Chronic heart failure (CHF) patients are frequently rehospitalized within 6 months after an episode of fluid retention. Rehospitalizations are preventable, but this requires an extensive organization of the healthcare system. In this study, we tested whether intensive follow-up of patients through a telemonitoring-facilitated collaboration between general practitioners (GPs) and a heart failure clinic could reduce mortality and rehospitalization rate.
METHODS AND RESULTS: One hundred and sixty CHF patients [mean age 76 ± 10 years, 104 males, mean left ventricular ejection fraction (LVEF) 35 ± 15%] were block-randomized by sealed envelopes and assigned to 6 months of intense follow-up facilitated by telemonitoring (TM) or usual care (UC). The TM group measured body weight, blood pressure, and heart rate on a daily basis with electronic devices that transferred the data automatically to an online database. Email alerts were sent to the GP and heart failure clinic to intervene when pre-defined limits were exceeded. All-cause mortality was significantly lower in the TM group as compared with the UC group (5% vs. 17.5%, P = 0.01). The total number of follow-up days lost to hospitalization, dialysis, or death was significantly lower in the TM group as compared with the UC group (13 vs. 30 days, P = 0.02). The number of hospitalizations for heart failure per patient showed a trend (0.24 vs. 0.42 hospitalizations/patient, P = 0.06) in favour of TM. CONCLUSION: Telemonitoring-facilitated collaboration between GPs and a heart failure clinic reduces mortality and the number of days lost to hospitalization, death, or dialysis in CHF patients. These findings need confirmation in a large trial.

AIM: To test the effects of a postdischarge transitional care programme among patients with coronary heart disease. BACKGROUND: Coronary heart disease is a leading cause of death worldwide. Effective postdischarge care can help patients maintain a healthy lifestyle and thereby control the risk factors. Transitional care is under-developed in mainland China. DESIGN: A randomised controlled trial. METHOD: The control group (n = 100) received routine care and the study group (n = 100) received the postdischarge transitional care programme, which consisted of predischarge assessment, structured home visits and telephone follow-ups within four weeks after discharge.
Subjects were recruited in 2002-2003, with data collected at baseline before discharge, two days and four and 12 weeks after discharge. RESULTS Participants in the study group had significantly better understanding of diet, medications and health-related lifestyle behaviour at day 2 and in weeks 4 and 12, and better understanding of exercise at weeks 4 and 12. There were significant differences between the control and study groups in diet and health-related lifestyle at day 2 and weeks 4 and 12, in medication at weeks 4 and 12, and in exercise at week 12. There was no difference in hospital readmission between the two groups. The study group was very satisfied with the care. There was no difference in willingness to pay for nurse follow-up services between groups. CONCLUSION This study is an original effort to establish and test a nurse-led transitional care model in China. Results demonstrate that transitional care is effective in mainland China, concurring with studies done elsewhere. RELEVANCE TO CLINICAL PRACTICE This study has constructed a transitional care model for patients with coronary heart disease in the context of the Chinese population which is effective in enhancing healthy lifestyle among these patients OBJECTIVES To evaluate the effect of an exercise-based model of hospital and in-home follow-up care for older people at risk of hospital readmission on emergency health service utilization and quality of life. DESIGN Randomized controlled trial. SETTING Tertiary metropolitan hospital in Australia. PARTICIPANTS One hundred twenty-eight patients (64 intervention, 64 control) with an acute medical admission, aged 65 and older and with at least one risk factor for readmission (multiple comorbidities, impaired functionality, aged ≥75, recent multiple admissions, poor social support, history of depression).
INTERVENTION Comprehensive nursing and physiotherapy assessment and individualized program of exercise strategies and nurse-conducted home visit and telephone follow-up commencing in the hospital and continuing for 24 weeks after discharge. MEASUREMENTS Emergency health service utilization (emergency hospital readmissions and visits to emergency department, general practitioner (GP), or allied health professional) and health-related quality of life (Medical Outcomes Study 12-item Short Form Survey (SF-12v2)) collected at baseline and 4, 12, and 24 weeks after discharge. RESULTS The intervention group required significantly fewer emergency hospital readmissions (22% of intervention group, 47% of control group, P=.007) and emergency GP visits (25% of intervention group, 67% of control group, P<.001). The intervention group also reported significantly greater improvements in quality of life than the control group as measured using SF-12v2 Physical Component Summary scores (F(3,279)=30.43, P<.001) and Mental Component Summary scores (F(3,279)=7.20, P<.001). CONCLUSION Early introduction of an individualized exercise program and long-term telephone follow-up may reduce emergency health service utilization and improve quality of life of older adults at risk of hospital readmission
11,077
30,055,600
Compared with healthy controls, patients with knee osteoarthritis have higher odds of having lower muscle strength, proprioception deficits, more medial varus-valgus laxity and less lateral varus-valgus laxity. Conclusions Patients with knee osteoarthritis are more likely to display a number of biomechanical characteristics.
Background To investigate (1) the association of specific biomechanical factors with knee osteoarthritis and knee osteoarthritis development, and (2) the impact of other relevant risk factors on this association.
OBJECTIVES To examine the objective physical function of the lower extremities, to measure the properties of the quadriceps femoris muscle (QFM), and to assess subjective disabilities in men with knee osteoarthritis (OA), and to compare the results with those obtained from age- and sex-matched control subjects. DESIGN Cross-sectional study. SETTING Rehabilitation clinic in a university hospital. PARTICIPANTS Male volunteers (n=54) (age range, 50-69y) with knee OA and randomly selected healthy, age- and sex-matched control subjects (n=53). INTERVENTIONS Not applicable. MAIN OUTCOME MEASURES Physical function evaluated with a test battery including the QFM composition measurement, the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC), and the RAND 36-Item Short-Form Health Survey, version 1.0. RESULTS Knee OA patients had 13% to 26% poorer (P range, .050-.001) physical function and muscle strength compared with the controls. There were also significant differences in QFM composition. WOMAC (P range, .050-.001) and muscle strength (P<.001) were associated with physical function tests, but subjective pain correlated with neither physical function nor muscle strength in knee OA patients. The radiographic knee OA grade did not have any significant effect on physical function, but passive knee motion, knee extension strength, and WOMAC were related to the severity of the disease (P<.05). CONCLUSIONS The patients with knee OA exhibited impaired physical function, muscle strength and QFM composition compared with healthy controls. The severity of radiographic knee OA clearly had adverse effects on functional ability at the later stages of the disease.
The results highlight the effect of QFM strength on physical function as well as the importance of the patient's subjective and objective physical function when deciding on knee OA treatment policy OBJECTIVE Although knee malalignment is assumed to correlate with knee osteoarthritis (OA), it is still unknown whether malalignment precedes the development of OA or whether it is a result of OA. The aim of this study was to assess the relationship between malalignment and the development of knee OA as well as progression of knee OA. METHODS A total of 1,501 participants in the Rotterdam study were randomly selected. Knee OA at baseline and at followup (mean followup 6.6 years) was scored according to the Kellgren/Lawrence (K/L) grading system. Alignment was measured by the femorotibial angle on radiographs at baseline. Multivariable logistic regression for repeated measurements was used to analyze the association of malalignment with the development and progression of OA. RESULTS Of 2,664 knees, 1,012 (38%) were considered to have normal alignment, 693 (26%) had varus alignment, and 959 (36%) had valgus alignment. A comparison of valgus alignment and normal alignment showed that valgus alignment was associated with a borderline significant increase in development of knee OA (odds ratio [OR] 1.54, 95% confidence interval [95% CI] 0.97-2.44), and varus alignment was associated with a 2-fold increased risk (OR 2.06, 95% CI 1.28-3.32). Stratification for body mass index showed that this increased risk was especially seen in overweight and obese individuals but not in non-overweight persons. The risk of OA progression was also significantly increased in the group with varus alignment compared with the group with normal alignment (OR 2.90, 95% CI 1.07-7.88). CONCLUSION An increasing degree of varus alignment is associated not only with progression of knee OA but also with development of knee OA.
However, this association seems particularly applicable to overweight and obese persons OBJECTIVES Foot orthoses are commonly used in the management of knee OA, although the relationship between foot function and knee OA is still unclear. The purpose of the study was to examine foot function during walking in people with and without medial compartment knee OA. METHODS Motion of the tibia, rearfoot and forefoot in 32 patients with medial compartment knee OA and 28 age-matched control subjects was investigated. Multivariate analysis was used to compare the groups. RESULTS The knee OA group contacted the ground with a more everted rearfoot, demonstrated greater peak rearfoot eversion and exhibited reduced rearfoot frontal plane range of motion and reduced rearfoot peak inversion. The tibia was more internally rotated and laterally tilted throughout the gait cycle, with reduced peak external rotation. CONCLUSION People with medial compartment knee OA exhibit altered foot kinematics during gait that are indicative of a less mobile, more everted foot type. The presence and degree of tibial malalignment and the available rearfoot range of motion during walking may affect individual responses to load-altering interventions, such as foot orthoses and footwear modifications. TRIAL REGISTRATION Australian New Zealand Clinical Trials Registry, www.anzctr.org.au/, ACTRN12608000116325 OBJECTIVE To compare age-related patterns of gait with patterns associated with knee osteoarthritis (OA), the following hypotheses were tested: (H1) the sagittal-plane knee function during walking is different between younger and older asymptomatic subjects; (H2) the age-related differences in H1 are increased in patients with knee OA. DESIGN Walking trials were collected for 110 participants (1.70 ± 0.09 m, 80 ± 14 kg).
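Several of the abstracts above (e.g. the Rotterdam alignment study) report odds ratios with 95% confidence intervals from logistic regression. As a minimal illustration of where such figures come from, the sketch below computes an unadjusted OR and Wald interval from a 2×2 table; the counts are hypothetical, and the published ORs are additionally adjusted for age, sex and BMI, which a plain 2×2 table cannot reproduce.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table.

    a: exposed cases, b: exposed non-cases,
    c: unexposed cases, d: unexposed non-cases.
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) is sqrt of summed reciprocal counts.
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts, for illustration only.
or_, lo, hi = odds_ratio_ci(40, 653, 30, 982)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

An adjusted OR, as reported in these studies, would instead come from the fitted coefficient of a multivariable logistic model (OR = exp(beta)).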
There were 29 younger asymptomatic subjects (29 ± 4 years) and 81 older participants (59 ± 9 years), the latter comprising 27 asymptomatic subjects, 28 patients with moderate and 26 with severe medial knee OA. Discrete variables characterizing sagittal-plane knee function were compared among the four groups using ANOVAs. RESULTS During the heel-strike portion of the gait cycle at preferred walking speed, the knee was less extended and the shank less inclined in the three older groups compared to the younger asymptomatic group. There were similar differences between the severe OA group and the older asymptomatic and moderate OA groups. Both OA groups also had the femur less posterior relative to the tibia and a smaller extension moment than the younger group. During terminal stance, the severe OA group had the knee less extended and a smaller knee extension moment than the younger asymptomatic and older moderate OA groups. CONCLUSIONS The differences in knee function, particularly those during heel-strike which were associated with both age and disease severity, could form a basis for looking at mechanical risk factors for initiation and progression of knee OA on a prospective basis OBJECTIVE To describe the association between chondral defects, bone marrow lesions, knee and hip radiographic osteoarthritis (OA), and knee pain. METHODS Knee pain was assessed by the Western Ontario and McMaster Universities Osteoarthritis Index. T1- and T2-weighted fat saturation magnetic resonance imaging was performed on the right knee to assess chondral defects and subchondral bone marrow lesions. Radiography was performed on the right knee and hip and scored for radiographic OA. Body mass index (BMI) and knee extension strength were measured. RESULTS A total of 500 randomly selected men and women participated. The prevalence of knee pain was 48%.
In multivariable analysis, prevalent knee pain was significantly associated with medial tibial chondral defects (odds ratio [OR] 2.32, 95% confidence interval [95% CI] 1.02-5.28 for grade 3 versus grade 2 or less; OR 4.93, 95% CI 1.07-22.7 for grade 4 versus grade 2 or less), bone marrow lesions (OR 1.44, 95% CI 1.04-2.00 per compartment), and hip joint space narrowing (OR 1.36, 95% CI 1.07-1.73 per unit), as well as greater BMI and lower knee extension strength. It was not significantly associated with radiographic knee OA. These variables were also associated with more severe knee pain. In addition, there was a dose-response association between knee pain and the number of sites having grade 3 or 4 chondral defects (OR 1.39, 95% CI 1.12-1.73 per site), with all subjects having knee pain if all compartments of the knee had these defects. CONCLUSION Knee pain in older adults is independently associated with both full and non-full-thickness medial tibial chondral defects, bone marrow lesions, greater BMI, and lower knee extension strength, but is not associated with radiographic knee OA. The association between radiographic hip OA and knee pain indicates that referred pain from the hip needs to be considered in unexplained knee pain BACKGROUND Given the complexity of the gait of patients with knee osteoarthritis, a multiple correspondence analysis may be helpful to optimise the extraction of relevant gait and clinical information. Therefore, the aims of this study are to identify the main associations with clinical and gait biomechanical parameters and to evaluate whether there are more specific knee osteoarthritis groups with different gait profiles. METHODS Ninety patients with severe knee osteoarthritis and twenty-six healthy individuals participated in this study.
Pain and function were assessed with the WOMAC Index; knee joint deformity was assessed by the hip-knee-ankle angle on full-limb radiography; and full body gait analysis was performed with a motion analysis system and force plates. FINDINGS Using multiple correspondence analysis, two categories of gait parameters that best explain the gait variance of patients with knee osteoarthritis were highlighted. The forward displacement category is composed of the parameters speed, stride length, hip flexion and knee flexion. The frontal category is composed of the parameters thorax obliquity and knee adductor moments. Moreover, based on these parameters, four distinct gait profiles were identified: two gait profiles were associated with knee varus deformities, increased thorax obliquity and different forward displacements, while two gait profiles were associated with valgus deformities and different forward displacements. INTERPRETATION These gait parameters can be used to simplify the characterisation of the gait of the knee osteoarthritis population. Patients in varus profiles increase thorax obliquity on the stance limb and may reduce forward displacement. Patients in valgus profiles, however, only reduce forward displacement OBJECTIVE To examine the relationship of knee malalignment to the occurrence of knee osteoarthritis (OA) among subjects without radiographic OA at baseline, to determine whether malalignment is a risk factor for incident disease or simply a marker of increasing disease severity. METHODS We selected 110 incident tibiofemoral (TF) OA case knees (76 subjects) and 356 random control knees (178 subjects) from among participants in the Framingham Osteoarthritis Study. Case knees did not have OA at baseline (1992-1994 examination) but had developed OA (Kellgren/Lawrence grade ≥2) at followup (2002-2005 examination) (mean of 8.75 years between examinations). Control knees did not have OA at baseline.
Standardized digital radiographs of the fully extended knee with weight-bearing were read using a standard protocol and eFilm viewing software. We measured the anatomic axis, the condylar angle, the tibial plateau angle, and the condylar tibial plateau angle. The interobserver intraclass correlation coefficient (ICC) ranged from 0.93 to 0.96 and the intraobserver ICC from 0.94 to 0.97. In a knee-specific analysis, we examined the relationship of each alignment measurement to the risk of TF OA using generalized estimating equations, adjusting for age, sex, and body mass index (BMI). We used the same approach to assess the association between each alignment measurement and the risk of medial TF OA. RESULTS Subjects in the case population were older and had a higher BMI than the controls. The alignment values were normally distributed and were not different between the cases and the controls. After adjustment for age, sex and BMI, there was no significant increase in incident OA in the highest quartile compared with the lowest quartile category for any of the alignment measures (P for trend for anatomic axis and condylar tibial plateau angle was 0.83 and 0.80, respectively). Similar results were also observed for medial compartment OA. CONCLUSION We found that baseline knee alignment is not associated with either incident radiographic TF OA or medial TF OA. These results suggest that malalignment is not a risk factor for OA, but rather is a marker of disease severity and/or its progression OBJECTIVE To compare the gait of adults with unilateral and bilateral symptomatic and radiographic knee osteoarthritis (OA) to determine whether these subgroups can be treated similarly in the clinic and when recruiting for randomized clinical trials, and to use these data to generate future hypotheses regarding gait in these subsets of knee OA patients.
METHODS Cross-sectional investigation of gait mechanics in patients with unilateral and bilateral knee OA using 136 older adults (age ≥55 yrs; 27 kg m(-2) ≤ BMI ≤ 41 kg m(-2); 82% female) with radiographic knee OA. Comparisons were made between the most affected side of the bilateral group (Bi) and the affected side of the unilateral group (Uni), and between symmetry indices of each group. RESULTS There were no significant differences in any temporal, kinematic, or kinetic measures between the Uni and Bi cohorts. Comparison of symmetry indices between groups also revealed no significant differences. CONCLUSION The similarity in lower extremity mechanics between unilateral and bilateral knee OA patients is sufficiently robust to consider both subsets as a single cohort. We hypothesize that biomechanical adaptations to knee OA are at least partially systemic in origin and not based solely on the physiological characteristics of an affected knee joint
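The unilateral/bilateral gait comparison above relies on symmetry indices without stating a formula. One widely used definition expresses the left-right difference as a percentage of the limb mean; this is a sketch of that convention, not necessarily the exact index the authors used, and the angle values are illustrative only.

```python
def symmetry_index(left, right):
    """Symmetry index in percent: 0 means perfect left-right symmetry.

    SI = |left - right| / (0.5 * (left + right)) * 100
    """
    return abs(left - right) / (0.5 * (left + right)) * 100.0

# Hypothetical peak knee flexion angles (degrees) for the affected
# and contralateral limb.
si = symmetry_index(55.0, 60.0)
print(f"SI = {si:.1f}%")
```

Values near 0% would support the paper's conclusion that unilateral and bilateral patients move symmetrically enough to pool into one cohort.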
11,078
30,567,547
Conclusion The heterogeneity of published research and the limited quality of measurement tools yielded no conclusive evidence on the association of presenteeism with hypothesized exposures, economic costs, or interventions amongst hospital healthcare workers.
Background Presenteeism is a behavior in which an employee is physically present at work with reduced performance due to illness or other reasons. Hospital doctors and nurses are more inclined to exhibit presenteeism than other professional groups, resulting in diminished staff health, reduced team productivity and potentially higher indirect presenteeism-related medical costs than absenteeism. Robust presenteeism intervention programs and productivity costing studies are available in the manufacturing and business sectors but not the healthcare sector. This systematic review aims to 1) identify instruments measuring presenteeism and its exposures and outcomes; 2) appraise the related workplace theoretical frameworks; and 3) evaluate the association between presenteeism, its exposures and outcomes, and the financial costs of presenteeism as well as interventions designed to alleviate presenteeism amongst hospital doctors and nurses.
Objectives This article investigates various reasons for sickness presenteeism (SP), that is, going to work despite illness. The research question asked is: What are the main reported reasons for SP in Norway and Sweden? Design Cross-sectional survey in Norway and Sweden, using binomial logistic regression analysis. Participants A random sample of people aged between 20 and 60 years was obtained from complete and updated databases of the Norwegian and Swedish populations. A postal questionnaire was sent to the selected individuals, with a response rate of 33% (n=2843); 2533 workers responded to questions about SP during the last 12 months. Primary and secondary outcome measures The article reports the distribution of reasons for SP in Norway and Sweden, selected by the respondents from a closed list. The article also examines which factors influence the most often reported reasons for SP. Results 56% of the Norwegian and Swedish respondents experienced SP in the previous year. The most frequently reported reasons for SP include not wanting to burden colleagues (43%), enjoying work (37%) and feeling indispensable (35%). A lower proportion of Norwegians state that they cannot afford to take sick leave (adjusted OR (aOR) 0.16 (95% CI 0.10 to 0.22)), while a higher proportion of Norwegians refer to enjoying their work (aOR=1.64 (95% CI 1.28 to 2.09)). Women and young workers more often report that they do not want to burden their colleagues. Managers (aOR=2.19 (95% CI 1.67 to 2.86)), highly educated persons and the self-employed more often report that they are indispensable. Conclusions Positive and negative reasons for SP are reported, and there are significant differences between respondents from the two countries. The response rate is low and results must be interpreted with caution. Study design Cross-sectional study Background Burnout in healthcare is a worldwide problem.
However, most studies focus narrowly on work-related factors and outcomes in one health profession or speciality. Aims To investigate the prevalence of burnout and its association with job demands, job resources, individual well-being, work-related attitudes and behaviour in physicians and nurses across different specialties. Methods Multi-centre cross-sectional study of physicians and nurses working in Belgian hospitals. An electronic questionnaire was used to assess job demands (e.g. workload), job resources (e.g. autonomy) and indicators of well-being, work-related attitudes and behaviours. Structural equation modelling was used to examine interrelationships between explanatory variables and outcomes. Results 1169 physicians and 4531 nurses participated; response rate 26%. High scores (>75th percentile in a reference group of Dutch health care workers) were seen in 6% of the sample on three burnout dimensions (i.e. emotional exhaustion, depersonalization and personal competence) and in 13% for at least two dimensions. In contrast to the other dimensions, emotional exhaustion was strongly related to almost all variables examined in the model. Positive associations were seen with workload, role conflicts, emotional burden and work-home interference, and negative associations with learning and development opportunities and co-worker support. Emotional exhaustion correlated negatively with well-being, turnover intention, being prepared and able to work until retirement age, medication use, absenteeism and presenteeism. Conclusions Work-related factors were critical correlates of emotional exhaustion, which was strongly related to poor health and turnover intention. Randomized controlled trials are suggested to underpin the effectiveness of interventions tackling job stressors and promoting job resources OBJECTIVES Nurses are at elevated risk of burnout, anxiety and depressive disorders, and may then become less productive.
This begs the question of whether a preventive intervention in the work setting might be cost-saving from a business perspective. MATERIAL AND METHODS A cost-benefit analysis was conducted to evaluate the balance between the costs of a preventive intervention among nurses at elevated risk of mental health complaints and the cost offsets stemming from improved productivity. This evaluation was conducted alongside a cluster-randomized trial in a Dutch academic hospital. The control condition consisted of screening without feedback and unrestricted access to usual care (N = 206). In the experimental condition screen-positive nurses received personalized feedback and referral to the occupational physician (N = 207). RESULTS Subtracting intervention costs from the cost offsets due to reduced absenteeism and presenteeism resulted in net savings of 244 euros per nurse when only absenteeism is regarded, and 651 euros when presenteeism is also taken into account. This corresponds to a return on investment of 5 euros up to 11 euros for every euro invested. CONCLUSIONS Within half a year, the cost of offering the preventive intervention was more than recouped. Offering the preventive intervention represents a favorable business case as seen from the employer's perspective The aim of this study is to assess the impact of a broad range of possible factors relating to work, personal circumstances and attitudes towards sickness absence on a person's decision to go to work despite feeling ill, a phenomenon that has been termed sickness presence (SP), or 'presenteeism', in the literature. Using data from a random sample of 12,935 members of the core Danish work force the hypotheses were tested in a cross-sectional design utilising ordered logistic regression models. The results indicate that more than 70% of the core work force goes ill to work at least once during a 12-month period.
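The cost-benefit abstract above reports net savings of 244 euros per nurse (absenteeism only) and a return on investment of about 5 euros per euro invested, but not the intervention cost itself. The sketch below shows the underlying accounting; the 61-euro intervention cost and 305-euro offset are assumptions chosen to reproduce the reported figures, since the abstract does not state them.

```python
def net_savings_and_roi(intervention_cost, cost_offsets):
    """Net savings = cost offsets minus intervention cost;
    ROI = euros recouped per euro invested."""
    net = cost_offsets - intervention_cost
    roi = cost_offsets / intervention_cost
    return net, roi

# Illustrative per-nurse figures (euros); both are assumptions, not
# values reported in the abstract.
net, roi = net_savings_and_roi(61.0, 305.0)
print(net, roi)
```

The same arithmetic with presenteeism-related offsets included would yield the larger reported savings of 651 euros and the higher end of the ROI range.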
This means that SP is just as prevalent a phenomenon as sickness absence. Many of the results from earlier studies of SP were replicated and new factors were discovered: for example, time pressure (having a supervisory role and/or working more than 45 h per week) and relationship with colleagues (measured by working in a small company, having non-standard hours and degree of cooperation) both increase the likelihood of SP. However, personal circumstances and attitudes, e.g. treating work as home (cf. Hochschild's thesis) and being over-committed to work, were also found to lead to higher levels of SP. Finally, we found that those with a conservative attitude to absence were most likely to turn up ill at work. Overall, work-related factors seem to be slightly more important than personal circumstances or attitudes in determining people's 'decision' to go to work ill. However, the relatively low explanatory power of these combined factors suggests that there are still many unknowns in this field of research Objective Although research has been conducted on how nurse staffing levels affect outcomes, there has been little investigation into how the health-related productivity of nurses is related to quality of care. Two major causes of worker presenteeism (reduced on-the-job productivity as a result of health problems) are musculoskeletal pain and mental health issues, particularly depression. This study sought to investigate the extent to which musculoskeletal pain or depression (or both) in RNs affects their work productivity and self-reported quality of care and considered the associated costs. Methods Using a cross-sectional survey design, a random sample of 2,500 hospital-employed RNs licensed in North Carolina were surveyed using a survey instrument sent by postal mail.
Specific measures included questions on individual and workplace characteristics, self-reported quality of care, and patient safety; a numeric pain rating scale, a depression tool (the Patient Health Questionnaire), and a presenteeism tool (the Work Productivity and Activity Impairment Questionnaire: General Health) were also incorporated. A total of 1,171 completed surveys were returned and used for analysis. Results Among respondents, the prevalence of musculoskeletal pain was 71%; that of depression was 18%. The majority of respondents (62%) reported a presenteeism score of at least 1 on a 0-to-10 scale, indicating that health problems had affected work productivity at least "a little." Pain and depression were significantly associated with presenteeism. Presenteeism was significantly associated with a higher number of patient falls, a higher number of medication errors, and lower quality-of-care scores. Baseline cost estimates indicate that the increased falls and medication errors caused by presenteeism are expected to cost $1,346 per North Carolina RN and just under $2 billion for the United States annually. Upper-boundary estimates exceed $9,000 per North Carolina RN and $13 billion for the nation annually. Conclusion More attention must be paid to the health of the nursing workforce to positively influence the quality of patient care and patient safety and to control costs We examined the effect of interview characteristics (i.e., recall interval, interview version) on estimates of health-related lost productive work time (LPT). Three versions of a telephone interview were administered using 7-day and 4-week recall periods. In a population-based survey, 7674 workers were randomly assigned to one of six interviews at contact; 615 participants received a follow-up interview.
We found strong evidence of under-reporting using a 4-week recall period and a non-significant trend towards over-reporting of LPT using a 7-day recall period. Of the three interviews, version 3 could be administered most quickly, on average, and yielded the most discriminating estimates of LPT by health condition (i.e., headache, allergic rhinitis, and cold/flu). Our data suggest that variation in relatively short recall periods influences estimates of health-related LPT. A 2-week recall period may be optimal for minimizing overall reporting error but requires additional research to verify OBJECTIVE Although major depression is thought to have substantial negative effects on work performance, the possibility of recall bias limits self-report studies of these effects. The authors used the experience sampling method to address this problem by collecting comparative data on moment-in-time work performance among service workers who were depressed and those who were not depressed. METHOD The group studied included 105 airline reservation agents and 181 telephone customer service representatives selected from a larger baseline sample; depressed workers were deliberately oversampled. Respondents were given pagers and experience sampling method diaries for each day of the study. A computerized autodialer paged respondents at random time points. When paged, respondents reported on their work performance in the diary. Moment-in-time work performance was assessed at five random times each day over a 7-day data collection period (35 data points for each respondent). RESULTS Seven conditions (allergies, arthritis, back pain, headaches, high blood pressure, asthma, and major depression) occurred often enough in this group of respondents to be studied. Major depression was the only condition significantly related to decrements in both of the dimensions of work performance assessed in the diaries: task focus and productivity.
These effects were equivalent to approximately 2.3 days absent because of sickness per depressed worker per month of being depressed. CONCLUSIONS Previous studies based on days missed from work significantly underestimate the adverse economic effects associated with depression. Productivity losses related to depression appear to exceed the costs of effective treatment BACKGROUND The specific job demands of working in a hospital may place nurses at elevated risk for developing distress, anxiety and depression. Screening followed by referral to early interventions may reduce the incidence of these health problems and promote work functioning. OBJECTIVE To evaluate the comparative cost-effectiveness of two strategies to promote work functioning among nurses by reducing symptoms of mental health complaints. Three conditions were compared: the control condition consisted of online screening for mental health problems without feedback about the screening results; the occupational physician condition consisted of screening, feedback and referral to the occupational physician for screen-positive nurses; the third condition included screening, feedback, and referral to e-mental health. DESIGN The study was designed as an economic evaluation alongside a pragmatic cluster randomised controlled trial with randomisation at hospital-ward level. SETTING AND PARTICIPANTS The study included 617 nurses in one academic medical centre in the Netherlands. METHODS Treatment response was defined as an improvement on the Nurses Work Functioning Questionnaire of at least 40% between baseline and follow-up. Total per-participant costs encompassed intervention costs, direct medical and non-medical costs, and indirect costs stemming from lost productivity due to absenteeism and presenteeism. All costs were indexed for the year 2011.
RESULTS At 6 months follow-up, significant improvement in work functioning occurred in 20%, 24% and 16% of the participating nurses in the control condition, the occupational physician condition and the e-mental health condition, respectively. In these conditions the total average annualised costs were €1752, €1266 and €1375 per nurse. The median incremental cost-effectiveness ratio for the occupational physician condition versus the control condition was dominant, suggesting cost savings of €5049 per treatment responder. The incremental cost-effectiveness ratio for the e-mental health condition versus the control condition was estimated at €4054 (added costs) per treatment responder. Sensitivity analyses attested to the robustness of these findings. CONCLUSIONS The occupational physician condition resulted in greater treatment response for lower costs relative to the control condition and can therefore be recommended. The e-mental health condition produced less treatment response than the control condition and cannot be recommended as an intervention to improve work functioning among nurses. AIM This paper is a report of a cohort study of healthcare workers' work attendance and its long-term consequences on health, burnout, work ability and performance. BACKGROUND Concepts and measures of work attendance have varied in the scientific literature. Attending work in spite of being sick can have serious consequences on health. There is little knowledge of which individual and work-related conditions increase work attendance, or of the long-term impact on health and performance. METHOD Prospective analyses of three measures of work attendance, i.e. sickness attendance, uninterrupted long-term attendance and balanced attendance (≤7 days of sick leave per year and no sickness attendance), were done using questionnaire data from a 2-year cohort study (2004-2006) of randomly selected healthcare workers (n = 2624).
Incentives (e.g. effort-reward balance, social support, meaningfulness) and requirements (e.g. time pressure, dutifulness, high responsibility) to attend work, as well as general health, burnout, sick leave, work ability and performance, were assessed. RESULTS There was a positive relation between balanced work attendance and incentives, whereas high sickness attendance was associated with requirements. Follow-up after 2 years showed that balanced attendance was associated with sustained health and performance, while sickness attendance was associated with poor health, burnout, sick leave and decreased performance. CONCLUSION It is important to distinguish between measures of work attendance as they differ in relation to incentives and to health- and performance-related consequences. Sickness attendance seems to be an important risk indicator. A balanced work attendance should be promoted for sustained health and performance in healthcare organisations. Aims: The present study examined the differences between physicians working in public and private health care in strenuous working environments (presence of occupational hazards, physical violence, and presenteeism) and health behaviours (alcohol consumption, body mass index, and physical activity). In addition, we examined whether gender or age moderated these potential differences. Methods: Cross-sectional survey data were compiled on 1422 female and 948 male randomly selected physicians aged 25-65 years from The Finnish Health Care Professionals Study. Logistic regression and linear regression analyses were used with adjustment for gender, age, specialisation status, working time, managerial position, and on-call duty. Results: Occupational hazards, physical violence, and presenteeism were more commonly reported by physicians working in the public sector than by their counterparts in the private sector.
Among physicians aged 50 years or younger, those who worked in the public sector consumed more alcohol than those who worked in the private sector, whereas in those aged 50 or more the reverse was true. In addition, working in the private sector was most strongly associated with lower levels of physical violence in those who were older than 50 years, and with lower levels of presenteeism among those aged 40-50 years. Conclusions: The present study found evidence for the public sector being a more strenuous work environment for physicians than the private sector. Our results suggest that public healthcare organisations should pay more attention to the working conditions of their employees.
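The cost-effectiveness comparison in the nurse screening trial above can be reproduced as a point-estimate sketch (per-nurse annual costs of €1752/€1266/€1375 and responder rates of 20%/24%/16% for the control, occupational-physician and e-mental-health arms). Note this is illustrative arithmetic only: the trial's reported €5049 and €4054 figures are bootstrap medians, so they are not simple ratios of these point estimates, and the `compare` helper below is a hypothetical name, not anything from the study.

```python
# Point-estimate incremental cost-effectiveness sketch (illustrative only;
# the trial itself reports bootstrap-median ICERs, which differ).
def compare(cost_new, eff_new, cost_ref, eff_ref):
    d_cost = cost_new - cost_ref   # incremental cost per nurse
    d_eff = eff_new - eff_ref      # incremental responder rate
    if d_cost <= 0 and d_eff >= 0:
        return "dominant"          # cheaper and at least as effective
    if d_cost >= 0 and d_eff <= 0:
        return "dominated"         # costlier and no more effective
    return d_cost / d_eff          # incremental cost per extra responder

# control: €1752/nurse, 20% responders; occupational-physician arm: €1266, 24%
print(compare(1266, 0.24, 1752, 0.20))  # -> dominant
# e-mental-health arm (€1375, 16%): cheaper but less effective, so the ratio
# reads as euros saved per treatment responder foregone
print(compare(1375, 0.16, 1752, 0.20))
```

The dominance check mirrors the trial's qualitative conclusion: the occupational-physician arm both saved money and produced more responders than control.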
11,079
31,064,328
Reduction in depression severity was greater in studies with younger patients using either coil. The comparison between coils showed a larger reduction in depression severity in H1-coil vs. F8-coil studies (independent of the study design or the concurrent pharmacotherapy) and a trend towards higher remission rates in F8-coil vs. H1-coil studies. Conclusion When matched on frequency, the higher-intensity and less focal stimulation with the H1-coil reduces depression more than the lower-intensity and more focal stimulation with the F8-coil.
Background The current study aims to systematically assess and compare the antidepressant outcomes of repetitive transcranial magnetic stimulation (rTMS) with the figure-of-eight (F8) coil and deep transcranial magnetic stimulation (DTMS) with the H1-coil in studies matched on stimulation frequency in unipolar major depressive disorder (MDD).
BACKGROUND Multiple groups have reported on the use of repetitive transcranial magnetic stimulation (rTMS) in treatment-resistant major depression. The purpose of this study is to assess the efficacy of rTMS in unmedicated, treatment-resistant patients who meet criteria for major depression. METHODS Depressed subjects, who had failed to respond to a median of four treatment trials, were assigned in a randomized double-blind manner to receive either active (n = 10; 20 2-sec trains of 20 Hz stimulation with 58-sec intervals, delivered at 80% motor threshold with the figure-of-eight coil positioned over the left dorsolateral prefrontal cortex) or sham (n = 10; similar conditions with the coil elevated and angled 45 degrees tangentially to the scalp) rTMS. These sequences were applied during 10 consecutive weekdays. Continuous electroencephalogram sampling and daily motor threshold determinations were also obtained. RESULTS The group mean 25-item Hamilton Depression Rating Scale (HDRS) score was 37.2 (±2.0 SEM) points. Adjusted mean decreases in HDRS scores were 14.0 (±3.7) and 0.2 (±4.1) points for the active and control groups, respectively (p < .05). One of 10 subjects receiving active treatment demonstrated a robust response (i.e., HDRS decreased from 47 to 7 points); three other patients demonstrated 40-45% decreases in HDRS scores. No patients receiving sham treatment demonstrated partial or full responses. CONCLUSIONS A 2-week course of active rTMS resulted in statistically significant but clinically modest reductions of depressive symptoms, as compared to sham rTMS, in a population characterized by treatment resistance. Major depressive disorder (MDD) is a prevalent and disabling condition, and many patients do not respond to available treatments. Deep transcranial magnetic stimulation (dTMS) is a new technology allowing non-surgical stimulation of relatively deep brain areas.
This is the first double-blind randomized controlled multicenter study evaluating the efficacy and safety of dTMS in MDD. We recruited 212 MDD outpatients, aged 22-68 years, who had either failed one to four antidepressant trials or not tolerated at least two antidepressant treatments during the current episode. They were randomly assigned to monotherapy with active or sham dTMS. Twenty sessions of dTMS (18 Hz over the prefrontal cortex) were applied during 4 weeks acutely, and then biweekly for 12 weeks. Primary and secondary efficacy endpoints were the change in the Hamilton Depression Rating Scale (HDRS-21) score and response/remission rates at week 5, respectively. dTMS induced a 6.39-point improvement in HDRS-21 scores, while a 3.28-point improvement was observed in the sham group (p = 0.008), resulting in a 0.76 effect size. Response and remission rates were higher in the dTMS than in the sham group (response: 38.4 vs. 21.4%, p = 0.013; remission: 32.6 vs. 14.6%, p = 0.005). These differences between active and sham treatment were stable during the 12-week maintenance phase. dTMS was associated with few and minor side effects, apart from one seizure in a patient in whom a protocol violation occurred. These results suggest that dTMS constitutes a novel intervention in MDD, which is efficacious and safe in patients not responding to antidepressant medications, and whose effect remains stable over 3 months of maintenance treatment. Repetitive transcranial magnetic stimulation (rTMS) of the dorsolateral prefrontal cortex is a relatively non-invasive technique with putative therapeutic effects in major depression. However, the exact neurophysiological basis of these effects needs further clarification. Therefore, we studied the impact of ten daily sessions of left dorsolateral prefrontal rTMS on motor cortical excitability, as revealed by transcranial magnetic stimulation-elicited motor-evoked potentials, in 30 patients.
As compared to the non-responders, responders (33%) showed changes in parameters pointing towards a reduced cortical excitability. These results suggest that repetitive transcranial magnetic stimulation of the dorsolateral prefrontal cortex may have inhibitory effects on motor cortical neuronal excitability in patients with major depressive disorder. Furthermore, measurement of motor cortical excitability may be a useful tool for investigating and monitoring inhibitory brain effects of antidepressant stimulation techniques like rTMS. Flaws in the design, conduct, analysis, and reporting of randomised trials can cause the effect of an intervention to be underestimated or overestimated. The Cochrane Collaboration's tool for assessing risk of bias aims to make the process clearer and more Abstract Objectives. Intensified repetitive transcranial magnetic stimulation (rTMS) applied to the left dorsolateral prefrontal cortex (DLPFC) may result in fast clinical responses in treatment-resistant depression (TRD). In these kinds of patients, subgenual anterior cingulate cortex (sgACC) functional connectivity (FC) seems to be consistently disturbed. So far, no de novo data on the relationship between sgACC FC changes and clinical efficacy of accelerated rTMS were available. Methods. Twenty unipolar TRD patients, all at least stage III treatment resistant, were recruited in a randomized sham-controlled crossover high-frequency (HF) rTMS treatment study. Resting-state (rs) functional MRI scans were collected at baseline and at the end of treatment. Results. HF-rTMS responders showed significantly stronger resting-state functional connectivity (rsFC) anti-correlation between the sgACC and parts of the left superior medial prefrontal cortex. After successful treatment an inverted relative strength of the anti-correlations was observed in the perigenual prefrontal cortex (pgPFC).
No effects on sgACC rsFC were observed in non-responders. Conclusions. Strong rsFC anti-correlation between the sgACC and parts of the left prefrontal cortex could be indicative of a beneficial outcome. Accelerated HF-rTMS treatment designs have the potential to acutely adjust deregulated sgACC neuronal networks in TRD patients. BACKGROUND Electroconvulsive therapy (ECT) is an effective alternative to pharmacotherapy in treatment-resistant depressive patients, but its side effects limit its use. Transcranial magnetic stimulation (TMS) has been proposed as a refined alternative, but most studies do not indicate that TMS is as effective as ECT for severe depression. OBJECTIVE We propose that the limited effectiveness of standard TMS resides in its superficial effect on the cortex, although much of the pathophysiology of depression is associated with deeper and larger brain regions implicated in the reward system. Herein, we tested the effectiveness and safety of a novel TMS coil, the "H-coil," which enables direct stimulation of deeper brain regions, at the expense of focality. METHODS We studied the antidepressant and cognitive effects induced by 4 weeks of high-frequency (20 Hz) repeated deep TMS (DTMS) over the prefrontal cortex (PFC) of 65 medication-free depressive patients who had failed to benefit from prior medications. Patients were randomly assigned to various treatment configurations differing in stimulation intensity and laterality. Effects were assessed by the 24-item Hamilton depression rating scale (HDRS-24) and several secondary outcome measures. RESULTS A significant improvement in HDRS scores was found when high, but not low, stimulation intensity was used. Several cognitive improvements were evident, and no treatment-related serious adverse events were observed. CONCLUSIONS DTMS over the PFC was found safe and effective in alleviating depression.
The results accentuate the significance of deep, high-intensity stimulation over low, and serve as the first study to indicate the potential of DTMS in psychiatric and neurologic disorders. BACKGROUND We tested whether transcranial magnetic stimulation (TMS) over the left dorsolateral prefrontal cortex (DLPFC) is effective and safe in the acute treatment of major depression. METHODS In a double-blind, multisite study, 301 medication-free patients with major depression who had not benefited from prior treatment were randomized to active (n = 155) or sham TMS (n = 146) conditions. Sessions were conducted five times per week with TMS at 10 pulses/sec, 120% of motor threshold, 3000 pulses/session, for 4-6 weeks. Primary outcome was the symptom score change as assessed at week 4 with the Montgomery-Asberg Depression Rating Scale (MADRS). Secondary outcomes included changes on the 17- and 24-item Hamilton Depression Rating Scale (HAMD) and response and remission rates with the MADRS and HAMD. RESULTS Active TMS was significantly superior to sham TMS on the MADRS at week 4 (with a post hoc correction for inequality in symptom severity between groups at baseline), as well as on the HAMD17 and HAMD24 scales at weeks 4 and 6. Response rates were significantly higher with active TMS on all three scales at weeks 4 and 6. Remission rates were approximately twofold higher with active TMS at week 6 and significant on the MADRS and HAMD24 scales (but not the HAMD17 scale). Active TMS was well tolerated, with a low dropout rate for adverse events (4.5%) that were generally mild and limited to transient scalp discomfort or pain. CONCLUSIONS Transcranial magnetic stimulation was effective in treating major depression with minimal side effects reported.
It offers clinicians a novel alternative for the treatment of this disorder. OBJECTIVE Preliminary studies have indicated that daily left prefrontal repetitive transcranial magnetic stimulation might have antidepressant activity. The authors sought to confirm this finding by using a double-blind crossover design. METHOD Twelve depressed adults received, in random order, 2 weeks of active treatment (repetitive transcranial magnetic stimulation, 20 Hz at 80% motor threshold) and 2 weeks of sham treatment. RESULTS Changes from the relevant phase baseline in scores on the 21-item Hamilton depression scale showed that repetitive transcranial magnetic stimulation significantly improved mood over sham treatment. During the active-treatment phase, Hamilton depression scale scores decreased 5 points, while during sham treatment the scores increased, or worsened, by 3 points. No adverse effects were noted. CONCLUSIONS These placebo-controlled results suggest that daily left prefrontal repetitive transcranial magnetic stimulation has antidepressant activity when administered at these parameters. Further controlled studies are indicated to explore optimal stimulation characteristics and location, potential clinical applications, and possible mechanisms of action. It has been 30 years since the discovery that repeated electrical stimulation of neural pathways can lead to long-term potentiation in hippocampal slices. With its relevance to processes such as learning and memory, the technique has produced a vast literature on mechanisms of synaptic plasticity in animal models. To date, the most promising method for transferring these methods to humans is repetitive transcranial magnetic stimulation (rTMS), a noninvasive method of stimulating neural pathways in the brain of conscious subjects through the intact scalp. However, the effects on synaptic plasticity reported are often weak, highly variable between individuals, and rarely last longer than 30 min.
Here we describe a very rapid method of conditioning the human motor cortex using rTMS that produces a controllable, consistent, long-lasting, and powerful effect on motor cortex physiology and behavior after an application period of only 20-190 Background The aim of the current study was to investigate the cognitive correlates of repetitive transcranial magnetic stimulation (rTMS) in 10 treatment-resistant depression patients. Methods Patients received forty 20-min sessions of fast-frequency (10 Hz) rTMS of the left dorsolateral prefrontal cortex (DLPFC) over 20 days. Concept-shift ability (accuracy and duration of performance) was assessed daily with a Modified Concept-Shifting Task (mCST) in patients and in eight healthy volunteers. A general cognitive functioning test (Repeatable Battery for the Assessment of Neuropsychological Status; RBANS), the Beck Depression Inventory (BDI) and the Hamilton Depression Rating Scale (HAM-D) were applied before the first and after the last rTMS session. Results Compared to before rTMS on the first 10 days, the patients performed the mCST significantly more accurately after rTMS on the last 10 days (p < .001, partial eta squared = .78), while the same comparison in healthy volunteers was not statistically significant (p = .256, partial eta squared = .18). A significant improvement in immediate memory on the RBANS and reductions in BDI and HAM-D scores were also observed after the last rTMS session compared to before the first. Conclusion rTMS is associated with an improvement in selective cognitive functions that is not explained by practice effects on tasks administered repeatedly.
Trial registration Name: "Repetitive Transcranial Magnetic Stimulation (rTMS) in the treatment of depression, assessed with HAM-D over a four week period." URL: www.actr.org.au Registration number: Randomized controlled trials support the antidepressant efficacy of transcranial magnetic stimulation (TMS); however, there is individual variability in the magnitude of response. Examination of response predictors has been hampered by methodological limitations such as small sample sizes and single-site study designs. Data from a multisite sham-controlled trial of the antidepressant efficacy of TMS provided an opportunity to examine predictors of acute outcome. An open-label extension for patients who failed to improve provided the opportunity for confirmatory analysis. Treatment was administered to the left dorsolateral prefrontal cortex at 10 pulses per second, 120% of motor threshold, for a total of 3000 pulses per day. Change on the Montgomery-Asberg Depression Rating Scale after 4 weeks was the primary efficacy outcome. A total of 301 patients with nonpsychotic unipolar major depression at 23 centers were randomized to active or sham TMS. Univariate predictor analyses showed that the degree of prior treatment resistance in the current episode was a predictor of positive treatment outcome in both the controlled study and the open-label extension trial. In the randomized trial, shorter duration of the current episode was also associated with a better outcome. In the open-label extension study, absence of anxiety disorder comorbidity was associated with an improved outcome, but duration of the current episode was not. The number of prior treatment failures was the strongest predictor of positive response to acute treatment with TMS.
Shorter duration of current illness and lack of anxiety comorbidity may also confer an increased likelihood of good antidepressant response to TMS. CONTEXT Daily left prefrontal repetitive transcranial magnetic stimulation (rTMS) has been studied as a potential treatment for depression, but previous work had mixed outcomes and did not adequately mask sham conditions. OBJECTIVE To test whether daily left prefrontal rTMS safely and effectively treats major depressive disorder. DESIGN Prospective, multisite, randomized, active sham-controlled (1:1 randomization), duration-adaptive design with 3 weeks of daily weekday treatment (fixed-dose phase) followed by continued blinded treatment for up to another 3 weeks in improvers. SETTING Four US university hospital clinics. PATIENTS Approximately 860 outpatients were screened, yielding 199 antidepressant drug-free patients with unipolar nonpsychotic major depressive disorder. INTERVENTION We delivered rTMS to the left prefrontal cortex at 120% motor threshold (10 Hz, 4-second train duration, and 26-second intertrain interval) for 37.5 minutes (3000 pulses per session) using a figure-eight solid-core coil. Sham rTMS used a similar coil with a metal insert blocking the magnetic field and scalp electrodes that delivered matched somatosensory sensations. MAIN OUTCOME MEASURE In the intention-to-treat sample (n = 190), remission rates were compared for the 2 treatment arms using logistic regression, controlling for site, treatment resistance, age, and duration of the current depressive episode. RESULTS Patients, treaters, and raters were effectively masked. Minimal adverse effects did not differ by treatment arm, with an 88% retention rate (90% sham and 86% active). Primary efficacy analysis revealed a significant effect of treatment on the proportion of remitters (14.1% active rTMS and 5.1% sham) (P = .02).
The odds of attaining remission were 4.2 times greater with active rTMS than with sham (95% confidence interval, 1.32-13.24). The number needed to treat was 12. Most remitters had low antidepressant treatment resistance. Almost 30% of patients remitted in the open-label follow-up (30.2% originally active and 29.6% sham). CONCLUSION Daily left prefrontal rTMS as monotherapy produced statistically significant and clinically meaningful antidepressant therapeutic effects greater than sham. TRIAL REGISTRATION clinicaltrials.gov Identifier: NCT00149838. Background Deep transcranial magnetic stimulation (DTMS) is a non-invasive method of stimulating widespread cortical areas and, presumably, deeper neural networks. The current study assessed the effects of DTMS in the treatment of substance use disorders (SUD) using a systematic review. Methods An electronic literature search (PsycInfo, Medline until April 2017) identified k = 9 studies (k = 4 randomized controlled trials, RCTs, with inactive sham, and k = 5 open-label studies). DTMS was most commonly applied using high frequency/intensity (10-20 Hz/100-120% of the resting motor threshold, MT) protocols for 10-20 daily sessions in cases with alcohol, nicotine or cocaine use disorders. The outcome measures were craving and dependence (according to standardized scales) or consumption (frequency, abstinence or results of biological assays) at the end of the daily treatment phases and at the last follow-up. Results Acute and longer-term (6-12 months) reductions in alcohol craving were observed after 20 sessions (20 Hz, 120% MT) relative to baseline in k = 4 open-label studies with comorbid SUD and major depressive disorder (MDD). In k = 2 RCTs without MDD, alcohol consumption acutely decreased after 10-12 sessions (10-20 Hz, 100-120% MT) relative to baseline or to sham.
Alcohol craving was reduced only after higher frequency/intensity DTMS (20 Hz, 120% MT) relative to sham in k = 1 RCT. Nicotine consumption was reduced and abstinence was increased after 13 sessions (10 Hz, 120% MT) and at the 6-month follow-up relative to sham in k = 1 RCT. Cocaine craving was reduced after 12 sessions (15 Hz, 100% MT) and at the 2-month follow-up relative to baseline in k = 1 open-label study, while consumption was reduced after 12 sessions (10 Hz, 100% MT) relative to baseline but not to sham in k = 1 RCT. Conclusions High-frequency DTMS may be effective at treating some SUD both acutely and in the longer term. Large RCTs with inactive sham are required to determine the efficacy and the optimal stimulation parameters of DTMS for the treatment of SUD. BACKGROUND Controversial results have been obtained using high-frequency transcranial magnetic stimulation (HF-rTMS) as an antidepressant treatment. METHODS Forty patients suffering from drug-resistant major depression received ten sessions of HF-rTMS at 90% of the motor threshold over the left prefrontal cortex, or sham stimulation, added to their pharmacological treatment, in a randomized double-blind design. In a second, open phase, patients still fulfilling inclusion criteria received ten additional sessions of HF-rTMS at 90% or 110%. RESULTS Real, but not sham, HF-rTMS was associated with a significant decrease in the Hamilton Depression Rating Scale, but only twelve patients showed a decrease of more than 50%. CONCLUSIONS Left prefrontal HF-rTMS was effective as an adjunct to antidepressant treatment, although the effect size was small. LIMITATIONS Small sample size and difficulties controlling for the placebo effect. CLINICAL RELEVANCE Questionable in more than half of the patients studied. Repetitive transcranial magnetic stimulation (rTMS) is being investigated as an alternative treatment for depression.
However, little is known about the clinical role and the neurophysiological mechanisms of action of rTMS in these patients. In this study, 99mTc-HMPAO single photon emission computed tomography (SPECT) was used to map the effects of left dorsolateral prefrontal rTMS on prefrontal activity in seven patients who met DSM-IV criteria for major depression resistant to pharmacological treatment. rTMS consisted of 30 trains of 2-s duration stimuli (20 Hz, 90% of motor threshold), separated by 30-s pauses. Each patient underwent three SPECTs: at baseline; during the first rTMS; and 1 week after 10 daily sessions of rTMS. Regional cerebral blood flow (rCBF) of each cerebral region was normalized to the rCBF value in the cerebellum, and relative changes in normalized rCBF were addressed using a region-of-interest analysis. The Hamilton Depression Rating Scale (HDRS) was used for clinical evaluation before and after rTMS. A significant rCBF increase after the 10 sessions of rTMS was found in the left prefrontal region (MANOVA F = 5.29, d.f. = 2,10, P = 0.027), but no significant rCBF changes were found during the first rTMS session. The remaining cerebral regions showed no significant rCBF changes at any time. Only two patients showed a clinical improvement after rTMS, with a 50% reduction of the initial HDRS score. The study was repeated under placebo conditions (identical design but with coil discharges directed into the air) in these two patients, who failed to show any rCBF increase during sham rTMS. No relationship was found between the percentage of left prefrontal rCBF change and the clinical findings. In conclusion, rTMS of the left prefrontal cortex induces a significant rCBF increase in this region, despite the limited clinical effect in our sample of depressed patients.
Cerebral perfusion SPECT is a useful tool to map cerebral activity changes induced by rTMS. Stimulation parameters seem to strongly influence the efficacy of repetitive transcranial magnetic stimulation (rTMS) in the management of treatment-resistant depressed patients. The most effective and safest parameters are yet to be defined. Moreover, systematic follow-up data available to document the duration of the therapeutic effects remain sparse. Twenty-one treatment-resistant depressed patients were randomized to either active rTMS (N = 12) or sham (N = 9) treatment in a double-blind design. Patients were kept on their medications. Sub-motor-threshold (MT) stimulation (80% MT) was delivered for 10 consecutive work days (20 Hz, 2-s trains, 20 trains). Subjects meeting pre-set criteria for responding were entered into a follow-up phase for up to 5 months. Utilizing the above stimulation parameters, we found no significant difference between groups. Six patients in the active group and one subject in the sham group met criteria for the follow-up phase. The period of time before subjects met criteria for relapse was highly variable, ranging from 2 to 20 weeks. Sub-threshold rTMS stimulation for 2 weeks is not significantly superior to sham treatment for treatment-resistant depressed patients. The duration of the therapeutic effects of rTMS delivered to the left prefrontal cortex using the above-described parameters is highly variable. A growing number of studies report antidepressant effects of repetitive transcranial magnetic stimulation (rTMS) in patients with major depression. The hypothesis that high frequency (20 Hz) rTMS (HF-rTMS) may speed up and strengthen the therapeutic response to sertraline in MD was tested.
Twenty-eight patients who had not yet received medication for the present depressive episode (n = 12) or had failed a single trial of an antidepressant medication (n = 16) were started on sertraline and randomised to receive either real or sham HF-rTMS. HF-rTMS was applied to the left dorsolateral prefrontal area in daily sessions (30 trains of 2 s, 20-40 s intertrain interval, at 90% motor threshold) on 10 consecutive working days. The results suggest that in this patient population, HF-rTMS does not add efficacy over the use of standard antidepressant medication. OBJECTIVE To acquire information about the physical properties and physiological effects of the H-coil. METHODS We used a robotized system to measure the electric field (E-field) generated by an H-coil prototype and compared it with a standard figure-of-eight coil. To explore the physiological properties of the coils, input/output curves were recorded for the right abductor digiti minimi muscle (ADM) as the target muscle. To explore focality of stimulation, simultaneous recordings were performed for the left ADM and the right abductor pollicis brevis (APB), extensor digitorum communis (EDC) and biceps brachii (BB) muscles. RESULTS Physical measurements of the H-coil showed four potentially stimulating foci, generating different electric field intensities along two different spatial orientations. RMT was significantly lower for H-coil than for figure-of-eight coil stimulation. When stimulation intensity for the input-output curve was determined by percent of maximum stimulator output, the H-coil produced larger MEPs in the right ADM, as compared to the figure-of-eight coil, due to the larger relative enhancement of stimulation intensity of the H-coil.
When stimulation intensity was adjusted to RMT, MEPs elicited at the right ADM were larger for figure-of-eight coil than for H-coil stimulation, while this relation was reversed for distant non-target muscles at low stimulation intensities. With high stimulation intensities, the H-coil elicited larger MEPs for all tested muscles. Onset latency of the MEPs was never shorter for H-coil than for figure-of-eight coil stimulation of the target muscles. CONCLUSIONS These results favour a non-focal, but not deeper, effect of the H-coil, as compared to a figure-of-eight coil. SIGNIFICANCE This is the first neurophysiological study exploring the focality and depth of stimulation delivered by the H-coil systematically in humans. We found no advantage of this coil with regard to depth of stimulation in comparison to the figure-of-eight coil. Future studies have to show whether the non-focality of this coil differs relevantly from that of other non-focal coils, e.g. the round coil. BACKGROUND An acute course of dTMS typically involves treatments delivered 5 days a week for 4 weeks. Should more treatments be given if the patient has not responded? Data are needed to inform decisions about the best next steps for acute non-responders. OBJECTIVE To characterize response among acute-phase non-responders in a randomized controlled trial of deep repetitive transcranial magnetic stimulation (dTMS) monotherapy for medication-resistant depression. METHODS Summary statistics and Kaplan-Meier curves were used to characterize outcomes of 33 medication-free Brainsway™ dTMS non-responders to double-blind but active treatment at the end of 4 weeks (20 sessions), who then continued double-blind but active twice-weekly treatment for up to 12 additional weeks.
RESULTS 24 participants (72.7%) achieved responder status during at least one rating with dTMS continuation -- 20 (60.6%) within four weeks, with 13 (39.4%) consistently meeting response criteria for the duration of the study. 20 (63.6%) achieved remission status at some point during treatment continuation. CONCLUSIONS A significant proportion of acute-course non-responders to dTMS treatment eventually respond with continued treatment. Continuing TMS treatment beyond the acute course for non-responders may result in eventual response in over half of these individuals. INTRODUCTION Co-occurrence of Major Depressive (MDD) and Alcohol Use Disorders (AUDs) is frequent, causing more burden than each disorder separately. Since the dorsolateral prefrontal cortex (DLPFC) is critically involved in both mood and reward and dysfunctional in both conditions, we aimed to evaluate the effects of dTMS stimulation of bilateral DLPFC with left prevalence in patients with MDD with or without concomitant AUD. METHODS Twelve MDD patients and 11 with concomitant MDD and AUD (MDD+AUD) received 20 dTMS sessions. Clinical status was assessed through the Hamilton Depression Rating Scale (HDRS) and the Clinical Global Impressions severity scale (CGIs), craving through the Obsessive Compulsive Drinking Scale (OCDS) in MDD+AUD, and functioning with the Global Assessment of Functioning (GAF). RESULTS There were no significant differences between the two groups in sociodemographic (age, sex, years of education and duration of illness) and baseline clinical characteristics, including scores on assessment scales. Per cent drops on HDRS and CGIs scores at the end of the sessions were respectively 62.6% and 78.2% for MDD+AUD, and 55.2% and 67.1% for MDD (p<0.001). HDRS, CGIs and GAF scores remained significantly improved after the 6-month follow-up.
HDRS scores dropped significantly earlier in MDD+AUD than in MDD. LIMITATIONS The small sample size and factors inherent to site and background treatment may have affected results. CONCLUSIONS High-frequency bilateral DLPFC dTMS with left preference was well tolerated and effective in patients with MDD, with or without AUD. The antidepressant effect of dTMS is not affected by alcohol abuse in patients with depressive episodes. The potential use of dTMS for mood modulation as an adjunct to treatment in patients with a depressive episode, with or without alcohol abuse, deserves further investigation.
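The per cent drops on rating-scale scores reported in trials like the dTMS study above are simple baseline-relative reductions. A minimal sketch of that arithmetic (the function name and the numbers are illustrative, not taken from the study):

```python
def percent_drop(baseline: float, endpoint: float) -> float:
    """Baseline-relative percent reduction of a rating-scale score."""
    return 100.0 * (baseline - endpoint) / baseline

# Illustrative values only: an HDRS score falling from 24 to 9
# corresponds to a 62.5% drop.
print(round(percent_drop(24, 9), 1))  # 62.5
```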
11,080
15,674,923
There was no significant difference in need for recatheterisation, although recatheterisation after removal at night was more likely to be during working hours. There is suggestive but inconclusive evidence of a benefit from midnight removal of the indwelling urethral catheter. The evidence also suggests shorter hospital stay after early rather than delayed catheter removal, but the effects on other outcomes are unclear. There is little evidence on which to judge other aspects of management, such as catheter clamping.
BACKGROUND Approximately 15% to 25% of all hospitalised patients have indwelling urethral catheters, mainly to assist clinicians to accurately monitor urine output during acute illness or following surgery, to treat urinary retention, and for investigative purposes. OBJECTIVES The objective of this review was to determine the best strategies for the removal of catheters from patients with a short-term indwelling urethral catheter.
OBJECTIVE To investigate whether early catheter removal following transurethral prostatectomy (TURP) is safe and whether it has any effect on the length of hospital stay. PATIENTS AND METHODS Following transurethral prostatectomy, 59 patients were randomized into one of two groups: those whose catheter was removed on day 1 after surgery and those whose catheter was removed on day 2. The incidence of complications and the duration of post-operative hospital stay were assessed. RESULTS Catheter removal on day 1 led to a significantly shorter post-operative hospital stay (2.3 days versus 3.3 days) and did not incur a higher incidence of complications. CONCLUSIONS Removal of the catheter on the first day following TURP is safe in selected patients and leads to a shorter post-operative hospital stay. In the fields of both nursing and medicine there is a dearth of published literature on the optimum time to remove indwelling urinary catheters (IDCs) following urological surgery. Tradition seems to be in favour of removing IDCs at 0600 hours despite a lack of evidence to support this practice. This study was undertaken to determine whether midnight removal of IDCs resulted in patients' resuming normal voiding patterns. A prospective clinical trial was conducted to determine the impact midnight removal of urinary catheters would have on the patients' voiding pattern, and subsequent discharge from hospital. One hundred and sixty patients were entered into the study. The patients were allocated at random to have their urinary catheter removed either at midnight or at 0600 hours. Patients who had their catheters removed at midnight passed a greater volume of urine with both their first (268 ml compared with 177 ml; P<0.0001) and second voids (322 ml compared with 195 ml; P<0.0001) than their counterparts in the 0600 group. This permitted earlier discharge from hospital.
The results reported in this study support the findings of earlier research that midnight removal of IDCs leads to an earlier resumption of normal voiding patterns, permits earlier discharge from hospital and appears to reduce patients' anxiety. The recommendation from this study is that there should be a change in hospital policy so that the majority of IDCs are removed at midnight. Patients who had undergone bladder neck surgery were randomized to having their urethral catheters removed either early in the morning or late at night. There was no difference in the incidence of urinary retention between these two groups of patients. However, patients who presented with acute urinary retention had a higher incidence of postoperative urinary retention. This study suggests that a urethral catheter may be safely removed in the evening without increasing the risk of urinary retention. There also seems to be no greater chance of the patient having to be recatheterized at an unsocial hour. OBJECTIVES Postoperative urethral catheter drainage after radical prostatectomy is bothersome to patients. A pilot study was initiated to determine if urethral catheter removal prior to hospital discharge is feasible. METHODS Thirty-three consecutive men undergoing radical retropubic prostatectomy were prospectively studied and followed for a minimum of 6 months (mean, 8.5). Postoperative cystography was utilized to direct early catheter removal. RESULTS Of 33 patients, 27 (82%) underwent successful catheter removal at a mean of 4.2 postoperative days. No patient experienced urinary retention, urinoma development, pelvic abscess, or anastomotic stricture. Urinary continence is excellent (no pads required) in 70% and good (stress incontinence requiring 1 to 2 pads/24 hours) in 18% of patients at last follow-up. CONCLUSIONS Following radical prostatectomy, early catheter removal prior to hospital discharge is feasible.
Early results suggest no deleterious consequences. Prospective monitoring of more patients is needed to determine if this practice is widely applicable. The interval before removal of the catheter used in prostatic transurethral surgery depends to a great extent on the surgeon, with a frequently empirical orientation. We conducted a prospective, randomized and controlled study of 213 patients who underwent transurethral surgery for benign prostatic hyperplasia. The catheter was removed systematically 24 hours after transurethral incision and 48 hours after transurethral resection of the prostate (group 1: 52 and 54 patients, respectively) or the catheterization interval was determined by each surgeon in accordance with the usual criteria (group 2: 52 and 55 patients, respectively). No statistically significant differences were noted between these 2 groups in regard to complications. We conclude that systematic removal of the catheter at the aforementioned periods is cost-effective, safe and comfortable for the patient. This study shows that removal of urinary catheters at midnight has several advantages over removal at 6 AM. The midnight group had a significantly greater initial voided volume and a longer time to first void than the equivalent 6 AM group. Advantages to midnight catheter removal also exist for nursing staff. Midnight tends to be less busy on the nursing unit compared with 6 AM, thus making it a preferable time for performance of routine tasks. Catheter removal at midnight also allows for convenient observation of patient voiding and assessment earlier in the day. This means that any necessary intervention can take place during working hours when more staff are on duty. There is also the potential for earlier discharge, with economic benefits related to shorter bed stay and more efficient discharge planning.
We believe midnight catheter removal offers considerable benefits over the traditional 6 AM time on both general and urology units. BACKGROUND Voiding dysfunction is frequently observed after rectal resection and justifies urinary drainage. However, there is no agreement about the optimal duration of this postoperative drainage. The aim of this controlled trial was to compare 1 versus 5 days of transurethral catheterization after rectal resection, with special reference to urinary tract infection and bladder retention. METHODS One hundred twenty-six patients undergoing rectal resection were included in a prospective randomized study designed to compare the results for patients undergoing 1 day of transurethral catheterization after rectal resection (1-day group) with those for patients undergoing 5 days' catheterization (5-day group). RESULTS Patients were randomly assigned to the 1-day and 5-day groups (n = 64 and 62, respectively). Clinical findings and surgical procedures were comparable in both groups. Acute urinary retention occurred in 16 patients (25%) in the 1-day group versus 6 (10%) in the 5-day group (P < .05). Urinary tract infection was observed in 13 of 64 patients (20%) in the 1-day group versus 26 of 62 (42%) in the 5-day group (P < .01). Multivariate analysis revealed that after 1 day of catheterization, carcinoma of the low rectum and lymph node metastasis were significant risk factors for acute urinary retention (P < .05 for both factors). After selection of patients without low rectum carcinoma, the acute urinary retention rate was comparable in both groups (14% in the 1-day group versus 7% in the 5-day group), but the urinary tract infection rate was significantly lower in the 1-day group versus the 5-day group (14% vs 40%, P < .01). CONCLUSIONS Our controlled study showed that after rectal resection, 1 day of urinary drainage can be recommended for most patients.
Five-day drainage should be reserved for patients with low rectal carcinoma. OBJECTIVES Shortening hospital stay yet not compromising quality of care can result in significant cost savings for children undergoing surgical correction of vesicoureteral reflux. METHODS We reviewed the medical records of pediatric patients who underwent ureteroneocystostomy between July 1995 and July 1997. A total of 43 patients, aged 0.2 to 18 years (mean 5.2), who all received identical postoperative care except for their pain management and the time of bladder catheter removal, were included in the study. Twenty-three were treated with intravenous ketorolac tromethamine (Toradol); the remaining 20 received narcotics in the immediate postoperative period. The bladder catheter was removed in less than 24 hours in 22 children, and greater than 24 hours in 21. RESULTS Patients who received ketorolac tromethamine for postoperative analgesia had on average shorter hospital lengths of stay than those treated with narcotics (1.4 versus 2.5 days, respectively; P < 0.001). The average stay for children whose bladder catheter was removed within 24 hours postoperatively was significantly shorter than those whose catheter was removed after a 24-hour period (1.4 versus 2.4 days, respectively; P < 0.001). There were no reimplantation failures. One child presented 2 days postoperatively with anemia, which did not require transfusion. CONCLUSIONS Our review demonstrates that ketorolac tromethamine can be used safely and effectively in children for immediate postoperative analgesia, and that its proper use combined with early catheter removal can reduce the length of hospital stay for pediatric patients undergoing ureteroneocystostomy. PURPOSE We prospectively tested the safety of routine removal of the catheter as early as 2 to 4 days after laparoscopic radical prostatectomy.
MATERIALS AND METHODS Between March 1998 and March 2001, 228 patients underwent laparoscopic radical prostatectomy for clinically organ-confined prostate cancer. The last 113 consecutive patients were included in a prospective study according to gravitational cystography performed 2 to 4 days postoperatively. If no leak was seen, the catheter was removed. If a leak was apparent, the catheter was left indwelling for another 6 days and cystography was repeated. RESULTS Cystography 2 to 4 days postoperatively showed an anastomosis without a leak in 96 (84.9%) patients, who subsequently had the catheters removed. There were 28 patients who had the catheter removed on postoperative day 2, 28 on day 3 and 40 on day 4. In 17 (15.1%) patients an anastomotic leak was observed, and the catheter was not removed at that time. Of the 96 patients in whom the catheter was removed early, 10 (10.4%) had urinary retention that necessitated re-catheterization. This procedure was performed without the need for cystoscopy. After the catheter was removed, all patients were able to void 24 hours later. Median followup was 7 months (range 1 to 15) and showed continence rates greater than 93%. No anastomotic stricture, pelvic abscess or urinoma developed in any patient. CONCLUSIONS Patients who undergo laparoscopic radical prostatectomy can have the catheter safely removed 2 to 4 days postoperatively without a higher risk of incontinence, stricture or leak-related problems. OBJECTIVE To compare three methods for a trial of micturition (TOM) (the midnight removal of the catheter, dawn removal, and a new infusion method) in a randomized prospective study. PATIENTS AND METHODS A total of 118 consecutive patients who had undergone transurethral resection of the prostate (TURP) or bladder neck incision (BNI) underwent TOM by one of the three methods.
In the infusion method, the bladder was filled at a fast-drip rate via the catheter from a bag of normal saline connected by an intravenous supply set. The catheter was then removed, the patient voided and the volume was measured. From the volume of saline remaining, it was possible to calculate the residual volume in the patient. RESULTS The infusion TOM took a mean 13 h less than the other two methods, which were statistically indistinguishable. CONCLUSION The infusion TOM is safe and simple, is quick to carry out and can be performed at any time. It establishes the completeness of bladder emptying, which helps in the assessment of voiding. PURPOSE We tested the hypothesis that early catheter removal may be accomplished safely after radical prostatectomy. MATERIALS AND METHODS Cystography on postoperative day 4 or 5 in 42 of 67 consecutive patients who underwent radical retropubic prostatectomy revealed no extravasation in 30, and the urethral catheter was removed (group 1). The control group included 25 patients who did not undergo cystography, and the catheter was removed 14 days postoperatively (group 2). RESULTS Immediate and late continence was achieved in 14 (46.7%) and 25 (83.3%) cases in group 1, and in 8 (32%) and 22 (88%) cases in group 2, respectively (p>0.05). Catheterization was performed easily without any endoscopic or surgical procedure in 2 patients (6.7%) in group 1 who presented in urinary retention after catheter removal. Wound infection and pelvic abscess developed in 1 case (3.3%). There were no late complications. In group 2, urinary retention developed in 1 patient (4%), wound infection in 1 (4%) and hematuria in 1 (4%). Two patients (8%) had late vesical neck contracture at 4 and 10 months, respectively, which required urethrotomy in 1. In 1 patient (4%) a stricture in the anterior urethra was dilated.
CONCLUSIONS Our study shows that early catheter removal may be accomplished safely in most patients after radical retropubic prostatectomy, and was not associated with a higher complication rate. This prospective study was done to see if reducing transurethral Foley catheterization from 3 days to 1 would lead to fewer urinary tract infections without an increase in voiding problems. Ninety-one women undergoing retropubic surgery for stress urinary incontinence (Burch or Marshall-Marchetti-Krantz) were randomized to either 1 or 3 days' catheterization. Antibiotics were not used. Infection was diagnosed in 9 (20.0%) patients in the 1-day group and in 16 (34.8%) in the 3-day group. Delayed voiding occurred in 13 (28.9%) and 10 (21.7%) patients, respectively, and 5 (11.1%) and 3 (6.5%), respectively, received a new catheter. The differences do not reach statistical significance. Therefore, catheter time may safely be reduced to 1 day. This may lead to fewer infections but also somewhat more voiding problems. If a transurethral catheter is to be used, on balance the two regimens are equivalent. The post-surgical urinary retention syndrome is a frequent problem after vaginal surgery. In many medical centers, transurethral vesical drainage is used for three to five days, with or without vesical reeducation, to prevent it. In order to determine the importance of the time of drainage and vesical reeducation in the presence of this syndrome, 106 patients submitted to vaginal surgery were studied randomly and prospectively in our service. Patients were distributed in three groups: the first one, with 37 women, in which the drainage was withdrawn at 24 hours; in the second group it was removed at 72 hours; and in the third group the drainage was removed at 72 hours with previous vesical reeducation.
The results show that those patients who spent less time under vesical drainage presented a lower frequency of urinary retention after surgery (24.3% vs 30.7% and 43.7%). A randomised controlled trial was undertaken to determine the effects of midnight removal of urinary catheters on patients' voiding patterns and subsequent discharge from hospital. Patients whose urinary catheters were removed at midnight showed a greater volume of initial void than those whose catheters were removed at the usual time of 0600. Removal of urinary catheters routinely at midnight permits earlier assessment of patients' voiding, which may allow for earlier discharge from hospital. In a randomized trial, treatment of urethral stricture by direct visual internal urethrotomy with 3 days of postoperative catheterisation was found to be sufficient, given minimal complications, and had a cure rate of 85% after 6 months and 80% after 1 year of follow-up. The postoperative follow-up should not include urethrography, but the patient's statement and maximal urinary flow rate might be adequate. This study was designed to compare the effectiveness of two approaches to urinary catheter management in controlling postoperative urinary dysfunction in 110 patients following abdominoperineal resection or low anterior bowel resection. Patients, stratified by sex and surgical procedure, were randomly assigned to either straight gravity drainage or a 6-day progressive catheter clamping program. The bladder training program following abdominoperineal resection reduced significantly the urinary dysfunction rate in women but not in men. Marked differences in dysfunction rates were related to type of surgery, gender, and surgeon. Etiology of postoperative voiding dysfunction in various groups is discussed.
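The infusion trial of micturition described earlier infers the post-void residual by simple volume accounting: the residual is the volume instilled from the saline bag minus the volume the patient voids. A minimal sketch of that arithmetic (function and variable names are my own, and the numbers are illustrative, not from the study):

```python
def residual_volume(bag_start_ml: float, bag_end_ml: float, voided_ml: float) -> float:
    """Residual = volume instilled into the bladder minus volume voided."""
    instilled = bag_start_ml - bag_end_ml  # what left the saline bag
    return instilled - voided_ml

# Illustrative: 1000 ml bag with 550 ml remaining, patient voids 400 ml
# -> 450 ml instilled, 50 ml residual.
print(residual_volume(1000, 550, 400))  # 50
```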
11,081
28,349,512
Clinically important improvement in global state was measured using the Clinical Global Impression (CGI).
BACKGROUND Chlorpromazine, a widely available and inexpensive antipsychotic drug, is considered the benchmark treatment for schizophrenia worldwide. Metiapine, a dibenzothiazepine derivative, has been reported to have potent antipsychotic characteristics. However, no evidence currently exists on the effectiveness of chlorpromazine in the treatment of people with schizophrenia compared with metiapine, a newer antipsychotic.
OBJECTIVE Despite the frequent use of the Positive and Negative Syndrome Scale (PANSS) for rating the symptoms of schizophrenia, the clinical meaning of its total score and of the cut-offs that are used to define treatment response (e.g. at least 20% or 50% reduction of the baseline score) are as yet unclear. We therefore compared the PANSS with simultaneous ratings of Clinical Global Impressions (CGI). METHOD PANSS and CGI ratings at baseline (n = 4091), and after one, two, four and six weeks of treatment, taken from a pooled database of seven pivotal, multi-center antipsychotic drug trials on olanzapine or amisulpride in patients with exacerbations of schizophrenia, were compared using equipercentile linking. RESULTS Being considered "mildly ill" according to the CGI approximately corresponded to a PANSS total score of 58, "moderately ill" to a PANSS of 75, "markedly ill" to a PANSS of 95 and "severely ill" to a PANSS of 116. To be "minimally improved" according to the CGI score was associated with a mean percentage PANSS reduction of 19%, 23%, 26% and 28% at weeks 1, 2, 4 and 6, respectively. The corresponding figures for a CGI rating "much improved" were 40%, 45%, 51% and 53%. CONCLUSIONS The results provide a better framework for understanding the clinical meaning of the PANSS total score in drug trials of schizophrenia patients with acute exacerbations. Such studies may ideally use at least a 50% reduction from baseline cut-off to define response rather than lower thresholds. In treatment-resistant populations, however, even a small improvement can be important, so that a 25% cut-off might be appropriate. The authors estimated components of variance and intraclass correlation coefficients (ICCs) to aid in the design of complex surveys and community intervention studies by analyzing data from the Health Survey for England 1994.
This cross-sectional survey of English adults included data on a range of lifestyle risk factors and health outcomes. For the survey, households were sampled in 720 postal code sectors nested within 177 district health authorities and 14 regional health authorities. Study subjects were adults aged 16 years or more. ICCs and components of variance were estimated from a nested random-effects analysis of variance. Results are presented at the district health authority, postal code sector, and household levels. Between-cluster variation was evident at each level of clustering. In these data, ICCs were inversely related to cluster size, but design effects could be substantial when the cluster size was large. Most ICCs were below 0.01 at the district health authority level, and they were mostly below 0.05 at the postal code sector level. At the household level, many ICCs were in the range of 0.0-0.3. These data may provide useful information for the design of epidemiologic studies in which the units sampled or allocated range in size from households to large administrative areas. Sixty newly admitted acute schizophrenic patients were randomly assigned to a double-blind trial of metiapine with a maximum dose of 450 mg per day versus a maximum daily dose of 900 mg of chlorpromazine per day. At the conclusion of the study, 21 patients in each group showed marked to moderate improvement. There were significantly more marked improvers in the metiapine group than in the chlorpromazine group on the Physician's Posttreatment Global Impression. Evaluation by analysis of covariance of the Brief Psychiatric Rating Scale showed a significant difference between treatment groups favoring chlorpromazine on the item of blunted affect. The spectrum of side effects was similar in the two groups, except for six patients treated with metiapine who displayed tachycardia on the EKG.
This pulse elevation was reflected in the group data and is probably dose related. In conclusion, both drugs appeared to be equally efficacious in the treatment of newly admitted acute schizophrenic patients. OBJECTIVES Clotiapine is a classic neuroleptic with a chemical structure similar to clozapine. It was said that patients unresponsive to other neuroleptics respond to clotiapine, although it causes extrapyramidal syndromes (EPS) like other typical neuroleptics. We conducted a study of clotiapine vs. chlorpromazine in severe, chronic, actively psychotic hospitalized schizophrenia patients. METHODS The design was a double-blind crossover of clotiapine vs chlorpromazine. No washout was necessary from previous neuroleptic treatment, and flexible overlap with the study medication was individualized for each patient. Patients were treated, after reaching neuroleptic monotherapy, for 3 months with clotiapine and 3 months with chlorpromazine, in random order. Medication was supplied in identical capsules of 100 mg chlorpromazine or 40 mg of clotiapine. The Positive and Negative Syndrome Scale (PANSS) and Clinical Global Impression (CGI) were rated every 2 weeks and the Nurse's Observation Scale for Inpatient Evaluation (NOSIE) every month. RESULTS Fifty-eight patients were randomized. Forty-three patients completed at least one phase of the study, and thirty-three completed both phases. Because of the small number of hostel patients and the very high dropout rate in the hostel patients, data analysis was done separately for inpatients and hostel patients. Clotiapine was significantly superior to chlorpromazine in 26 inpatients completing the crossover, on the PANSS, NOSIE and CGI. Clotiapine was also superior to chlorpromazine in an analysis of the parallel inpatient groups in the first three months before the crossover.
CONCLUSION Some classic neuroleptic compounds may have superiority to chlorpromazine in a "clozapine-like" manner, despite a typical profile for EPS. BACKGROUND A recent review suggested an association between using unpublished scales in clinical trials and finding significant results. AIMS To determine whether such an association existed in schizophrenia trials. METHOD Three hundred trials were randomly selected from the Cochrane Schizophrenia Group's Register. All comparisons between treatment groups and control groups using rating scales were identified. The publication status of each scale was determined and claims of a significant treatment effect were recorded. RESULTS Trials were more likely to report that a treatment was superior to control when an unpublished scale was used to make the comparison (relative risk 1.37 (95% CI 1.12-1.68)). This effect increased when a 'gold-standard' definition of treatment superiority was applied (RR 1.94 (95% CI 1.35-2.79)). In non-pharmacological trials, one-third of 'gold-standard' claims of treatment superiority would not have been made if published scales had been used. CONCLUSIONS Unpublished scales are a source of bias in schizophrenia trials. In a non-blind assessment of 3 neuroleptic drugs, chlorpromazine (Largactil), thioridazine (Melleril) and clotiapine (Etomine), we found Etomine to be the drug of choice when the diagnosis is in doubt between a toxic psychosis or schizophrenia. This drug also offered the highest discharge rate, 77.7% at 12 weeks, compared with 73.5% in the thioridazine group and 55.5% in the chlorpromazine group. No clouding of consciousness was seen in the clotiapine group, whereas it was troublesome in the chlorpromazine group in patients having received high parenteral doses. To comprehend the results of a randomised controlled trial (RCT), readers must understand its design, conduct, analysis, and interpretation.
That goal can be achieved only through total transparency from authors. Despite several decades of educational efforts, the reporting of RCTs needs improvement. Investigators and editors developed the original CONSORT (Consolidated Standards of Reporting Trials) statement to help authors improve reporting by use of a checklist and flow diagram. The revised CONSORT statement presented here incorporates new evidence and addresses some criticisms of the original statement. The checklist items pertain to the content of the Title, Abstract, Introduction, Methods, Results, and Discussion. The revised checklist includes 22 items selected because empirical evidence indicates that not reporting this information is associated with biased estimates of treatment effect, or because the information is essential to judge the reliability or relevance of the findings. We intended the flow diagram to depict the passage of participants through an RCT. The revised flow diagram depicts information from four stages of a trial (enrollment, intervention allocation, follow-up, and analysis). The diagram explicitly shows the number of participants, for each intervention group, included in the primary data analysis. Inclusion of these numbers allows the reader to judge whether the authors have done an intention-to-treat analysis. In sum, the CONSORT statement is intended to improve the reporting of an RCT, enabling readers to understand a trial's conduct and to assess the validity of its results.
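The response cut-offs discussed in the PANSS/CGI linking study earlier in this record (at least 50% reduction from baseline for acutely ill patients, ~25% for treatment-resistant populations) can be expressed as a simple classifier. A sketch only; the thresholds come from that abstract, while the helper functions themselves are hypothetical:

```python
def panss_percent_reduction(baseline: float, endpoint: float) -> float:
    # Percentage reduction relative to the baseline total score.
    return 100.0 * (baseline - endpoint) / baseline

def is_responder(baseline: float, endpoint: float, cutoff: float = 50.0) -> bool:
    # The linking study suggests a >= 50% reduction cut-off for acute
    # exacerbations and ~25% for treatment-resistant populations.
    return panss_percent_reduction(baseline, endpoint) >= cutoff

print(is_responder(95, 45))        # True  (about a 52.6% reduction)
print(is_responder(95, 75, 25.0))  # False (about a 21.1% reduction)
```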
11,082
26,882,483
In all studies, simulation training was as beneficial as, or more beneficial than, other instruction or no instruction.
PURPOSE The aim is to provide a complete overview of the different simulation-based training options for abdominal ultrasound and to explore the evidence of their effect.
Effective use of ultrasound requires an understanding of the physics, combined with the ability to interpret the sonographic images. The aim of our study was to evaluate the impact of a basic ultrasound curriculum using a phantom to train medical students. Twenty-eight first- to fourth-year medical students were randomized to two groups: a control group that received no formal training and a trained group that received basic ultrasound training. Both groups took an initial multiple-choice written test and an ultrasound hands-on test using an agarose-based tissue mimic containing various objects. The curriculum for the trained group consisted of reading the principles of ultrasound and a hands-on session over the phantom. After training, both groups underwent a second multiple-choice exam and ultrasound practical test. The initial and the post-training test results were analyzed using a two-tailed Student's t-test. Baseline written and practical test scores were similar for both groups. After training, written test scores improved (82% trained vs. 66% control, P < 0.001). Hands-on ultrasound task performance also improved with training (96% trained vs. 60% control, P < 0.001). The trained group took a shorter time to obtain a clear image and found on average one more object per scan. Parameters such as time to obtain a useful image and number of objects recognized also improved with training. Basic sonographic physics, imaging, and interpretation can be effectively taught to medical students during a short training session. Endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA) is very operator dependent and has a long learning curve. Simulation-based training might shorten the learning curve, and an assessment tool with solid validity evidence could ensure basic competency before unsupervised performance.
A total of 16 respiratory physicians, without EBUS experience, were randomised to either virtual-reality simulator training or traditional apprenticeship training on patients, and then each physician performed EBUS-TBNA procedures on three patients. Three blinded, independent assessors assessed the video recordings of the procedures using a newly developed EBUS assessment tool (EBUSAT). The internal consistency was high (Cronbach's α = 0.95); the generalisability coefficient was good (0.86), and the tool had discriminatory ability (p < 0.001). Procedures performed by simulator-trained novices were rated higher than procedures performed by apprenticeship-trained novices: mean ± SD 24.2 ± 7.9 points and 20.2 ± 9.4 points, respectively; p = 0.006. A pass/fail standard of 28.9 points was established using the contrasting groups method, resulting in 16 (67%) and 20 (83%) procedures performed by simulator-trained novices and apprenticeship-trained novices failing the test, respectively; p < 0.001. The endobronchial ultrasound assessment tool could be used to provide reliable and valid assessment of competence in EBUS-TBNA, and act as an aid in certification. Virtual-reality simulator training was shown to be more effective than traditional apprenticeship training. Virtual-reality simulation-based training shortens the learning curve for endobronchial ultrasound (EBUS). Flaws in the design, conduct, analysis, and reporting of randomised trials can cause the effect of an intervention to be underestimated or overestimated. The Cochrane Collaboration's tool for assessing risk of bias aims to make the process clearer and more accurate. BACKGROUND The need for surgeons to become proficient in performing and interpreting ultrasound examinations has been well recognized in recent years, but providing standardized training remains a significant challenge. The UltraSim (MedSim, Ft.
Lauderdale, Fla) ultrasound simulator is a modified ultrasound machine that stores patient data in three-dimensional images. By scanning on the UltraSim mannequin, the student can reconstruct these images in real time, eliminating the need for finding normal and abnormal models, while providing an objective method of both teaching and testing. The objective of this study was to compare the posttest results between residents trained on a real-time ultrasound simulator versus those trained in a traditional hands-on patient format. We hypothesized that both methods of teaching would yield similar results as judged by performance on the interpretive portion of a standardized posttest. It is designed as a prospective, cohort study from two university trauma centers involving residents at the beginning of their first or second postgraduate year of training. The main outcome measure was performance on a standardized posttest, which included interpretation of ultrasound cases recorded on videotape. METHODS Students first took a written pretest to evaluate their baseline knowledge of ultrasound physics as well as their ability to interpret basic ultrasound images. The didactic portion of the course used the same teaching materials for all residents and included lectures on ultrasound physics, ultrasound use in trauma/critical care, and a series of instructional videos. This didactic session was followed by 1 hour for each student of hands-on training on medical models/medical patients (group I) or by training on the ultrasound simulator (group II). The pretest was repeated at the completion of the course (posttest). Data were stratified by postgraduate year, i.e., PG1 or PG2. RESULTS A total of 74 residents were trained and tested in this study (PG1 = 48, PG2 = 26).
All residents showed significant improvement in their pretest and posttest scores (p = 0.00) in both their knowledge of ultrasound physics and in their interpretation of ultrasound images. Importantly, we could not demonstrate any significant difference between groups trained on models/patients (group I) versus those trained on the simulator (group II) when comparing their posttest interpretation of ultrasound images presented on videotapes (PG1, group I mean score 6.9 +/- 1.4 vs. PG1, group II mean score 6.5 +/- 1.6, p = 0.32; PG2, group I mean score 7.7 +/- 1.4 vs. PG2, group II mean score 7.9 +/- 1.2, p = 0.70). CONCLUSION The use of a simulator is a convenient and objective method of introducing ultrasound to surgery residents and compares favorably with the experience gained with traditional hands-on patient models. As medical education research advances, it is important that education researchers employ rigorous methods for conducting and reporting their investigations. In this article we discuss several important yet oft-neglected issues in designing experimental research in education. First, randomization controls for only a subset of possible confounders. Second, the posttest-only design is inherently stronger than the pretest–posttest design, provided the study is randomized and the sample is sufficiently large. Third, demonstrating the superiority of an educational intervention in comparison to no intervention does little to advance the art and science of education. Fourth, comparisons involving multifactorial interventions are hopelessly confounded, have limited application to new settings, and do little to advance our understanding of education. Fifth, single-group pretest–posttest studies are susceptible to numerous validity threats.
Finally, educational interventions (including the comparison group) must be described in detail sufficient to allow replication. Purpose To compare pelvic ultrasound simulators (PSs) with live models (LMs) for training in transvaginal sonography (TVS). Method The authors conducted a prospective, randomized controlled trial of 145 eligible medical students trained in TVS in 2011–2012 with either a PS or an LM. A patient educator was used for LM training. Simulated intrauterine and ectopic pregnancy models were used for PS training. Students were tested using a standardized patient who evaluated their professionalism. A proctor, blinded to training type, scored their scanning technique. Digital images were saved for blinded review. Students rated their training using a Likert scale (0 = not very well; 10 = very well). The primary outcome measure was students' overall performance on a 40-point assessment tool for professionalism, scanning technique, and image acquisition. Poisson regression and Student t test were used for comparisons. Results A total of 134 students participated (62 trained using a PS; 72 using an LM). Mean overall test scores were 56% for the PS group and 69% for the LM group (P = .001). A significant difference was identified in scanning technique (PS, 60% versus LM, 73%; P = .001) and image acquisition (PS, 37% versus LM, 59%; P = .001). None was observed for professionalism. The PS group rated their training experience at 4.4, whereas the LM group rated theirs at 6.2 (P < .001). Conclusions Simulators do not perform as well as LMs for training novices in TVS, but they may be useful as an adjunct to LM training. OBJECTIVES Ultrasonography is of growing importance within internal medicine (IM), but the optimal method of training doctors to use it is uncertain.
In this study, the authors provide the first objective comparison of two approaches to training IM residents in ultrasonography. METHODS In this randomised trial, a simulation-based ultrasound training curriculum was implemented during IM intern orientation at a tertiary care teaching hospital. All 72 incoming interns attended a lecture and were given access to online modules. Interns were then randomly assigned to a 4-hour faculty-guided (FG) or self-guided (SG) ultrasound training session in a simulation laboratory with both human and manikin models. Interns were asked to self-assess their competence in ultrasonography and underwent an objective structured clinical examination (OSCE) to assess their competence in basic and procedurally oriented ultrasound tasks. The primary outcome was the score on the OSCE. RESULTS Faculty-guided training was superior to self-guided training based on the OSCE scores. Subjects in the FG training group achieved significantly higher OSCE scores on the two subsets of task completion (0.9-point difference, 95% confidence interval [CI] 0.27–1.54; p = 0.008) and ultrasound image quality (2.43-point difference, 95% CI 1.5–3.36; p < 0.001). Both training groups demonstrated an increase in self-assessed competence after their respective training sessions and there was little difference between the groups. Subjects rated the FG training group much more favourably than the SG training group. CONCLUSIONS Both FG and SG ultrasound training curricula can improve the self-reported competence of IM interns in ultrasonography. However, FG training was superior to SG training in both skills acquisition and intern preference. Incorporating mandatory ultrasound training into IM residencies can address the perceived need for ultrasound training, improve confidence and procedural skills, and may enhance patient safety.
However, the optimal training method may require significant faculty input. The traditional apprenticeship model is being challenged in medical education. The "see-one, do-one, teach-one" approach does not live up to the demands regarding efficient training and patient safety. Ultrasound imaging is traditionally considered safe but the quality of the exam is very operator-dependent due to its dynamic nature 1. Highly skilled operators are essential as false negative findings could lead to inadequate investigation and treatment, and false positive findings might result in unnecessary interventions 2. This is true in particular because sonographic competence is a basic requirement for US as a dialog-based examination 3. Most trainees participate in defined ultrasound courses but specialized ultrasound examination expertise is mainly acquired in daily clinical practice 4 5. Fundamental problems are the random occurrence of specific cases, and insufficient formalized supervision and feedback may also occur due to scarce personnel resources 6. Ultrasound diagnostics quality assurance is a main concern for the European Federation of Societies for Ultrasound in Medicine and Biology (EFSUMB) 7 8. However, the current training conditions are likely to impair the quality 9. Virtual-reality (VR) simulation training has been mandatory in aviation for decades and is rapidly gaining importance in medical education. VR simulators have been developed for a multitude of different medical procedures, each consisting of a procedure-specific interface and a computer that generates images according to the movements of the user. A simulator has potential to deliver a safe, controlled and stress-free learning environment providing different performance scores as automatic feedback. Furthermore, standardized "patient" cases allow for a systematic assessment of competence and simulation-based mastery learning 10.
A systematic review and meta-analysis of more than 600 papers found large effects of VR simulation for outcomes of knowledge, skills, and behaviors, but very few of these papers related to ultrasound simulators 11. Several VR ultrasound simulators have become commercially available over the last years 12 13 14 15 16 17. A single system uses a haptic device to control the probe and the others use physical patient phantoms combined with electromagnetic tracking to track the position of the probe. The haptic device allows measuring the force that is applied but has a limited working range, making probe movements less realistic. Electromagnetic tracking allows a greater range of motion on the patient phantom but the absence of haptics makes it possible to generate simulated images without actually touching the phantom. Furthermore, image deformation as a result of pressure applied to the "patient" is not simulated. The simulators use two different methods to generate images: interpolative or generative model-based. Interpolative simulation creates 2-dimensional ultrasound images from previously recorded 3-dimensional volumes acquired from real patients. It is relatively easy to acquire many different cases with very realistic images, but it is difficult to simulate view-dependent artifacts correctly. Generative model-based simulation uses manually generated computer models to create images, which makes detailed 4-dimensional simulation possible, e.g. echocardiography. However, the images are less realistic ("cartoon-like") and building multiple cases is very time consuming 18. No simulator will ever be 100% realistic, and which simulator(s) to acquire depends largely on personal preferences and should be guided by thorough testing before purchase. Unfortunately, there is a paucity of high-quality scientific studies exploring the effects of the different VR ultrasound simulators.
A recent systematic review only identified 14 articles and half of these were prospective non-controlled studies 19. Only five of 14 studies contained evaluation of a study population – two of these used randomization. Another serious limitation concerning the available literature is that the vast majority of studies explored training-related skills acquisition in the same simulated environment, i.e. training on a simulator improves performance on that simulator. Caution has to be used before generalizing these results: whether the improvement can be directly transferred into daily clinical practice is doubtful. There is a huge need for high-quality psychometric studies gathering validity evidence concerning the different simulators and for randomized controlled trials (RCTs) exploring the effect of ultrasound simulator training. Performing RCTs in medical education poses several challenges 20. A comparison between intervention and no intervention is meaningless as any kind of training is likely to be better than no training at all. Due to the limited number of physicians it can be difficult to identify a sizable, homogenous group of equally experienced trainees waiting to learn ultrasound diagnostics. Furthermore, special attention must be paid to the outcome parameters used to examine the effect of different training regimens. Reliable and valid assessment instruments should be used 21 22 and blinded, video-based assessment should be preferred over direct observation to reduce bias 23. Overall, VR simulator training has a huge potential to improve training and education in ultrasound. Ultrasound simulators could be used in a highly standardized learning curriculum consisting of repetitive training on selected cases of increasing difficulty, including controlled exposure to rare cases.
By providing automatic feedback the simulators can reduce the time that experts have to be present, although they can never completely replace skilled supervisors. Finally, simulation-based tests including credible pass/fail standards may aid in the certification of future ultrasound operators 24. However, it is important to realize that the existing literature does not justify the huge organizational and financial investments that are necessary to make simulator training and certification a mandatory part of the education of future ultrasound operators. Implementation should be preceded by research focusing on the quality of simulator feedback, the reliability and validity of simulation-based tests, and the transfer of acquired skills into clinical practice. OBJECTIVES This study compared the effectiveness of a multimedia ultrasound (US) simulator to normal human models during the practical portion of a course designed to teach the skills of both image acquisition and image interpretation for the Focused Assessment with Sonography for Trauma (FAST) exam. METHODS This was a prospective, blinded, controlled education study using medical students as an US-naïve population. After a standardized didactic lecture on the FAST exam, trainees were separated into two groups to practice image acquisition on either a multimedia simulator or a normal human model. Four outcome measures were then assessed: image interpretation of prerecorded FAST exams, adequacy of image acquisition on a standardized normal patient, perceived confidence of image adequacy, and time to image acquisition. RESULTS Ninety-two students were enrolled and separated into two groups, a multimedia simulator group (n = 44) and a human model group (n = 48). Bonferroni adjustment determined the level of significance to be p = 0.0125.
There was no difference between those trained on the multimedia simulator and those trained on a human model in image interpretation (median 80 of 100 points, interquartile range [IQR] 71–87, vs. median 78, IQR 62–86; p = 0.16), image acquisition (median 18 of 24 points, IQR 12–18 points, vs. median 16, IQR 14–20; p = 0.95), trainee's confidence in obtaining images on a 1–10 visual analog scale (median 5, IQR 4.1–6.5, vs. median 5, IQR 3.7–6.0; p = 0.36), or time to acquire images (median 3.8 minutes, IQR 2.7–5.4 minutes, vs. median 4.5 minutes, IQR 3.4–5.9 minutes; p = 0.044). CONCLUSIONS There was no difference in teaching the skills of image acquisition and interpretation to novice FAST examiners using the multimedia simulator or normal human models. These data suggest that practical image acquisition skills learned during simulated training can be directly applied to human models. RATIONALE AND OBJECTIVES With advancements in technology and the push for health care reform and reduced costs, minimally invasive procedures, such as those that are ultrasound-guided, have become an essential part of radiology, and are used in many divisions of radiology. By incorporating standardized training methodologies in a risk-free environment through utilization of a simulation center with phantom training, we hope to improve proficiency and confidence in procedural performance. MATERIALS AND METHODS Twenty-nine radiology residents from four levels of training were enrolled in this prospective study. The residents were given written, video, and live interactive training on the basics of ultrasound-guided procedures in our simulation center on a phantom mannequin. All of the teaching materials were created by residents and staff radiologists at the institution.
RESULTS Residents demonstrated statistically significant improvement (P < .05) between their pre- and posttest scores on both the written and practical examinations. They also showed a trend toward improved dexterity in the technical aspects of ultrasound-guided procedures (P = .07) after training. On the survey questionnaire, residents confirmed improved knowledge level, technical ability, and confidence levels pertaining to ultrasound-guided procedures. CONCLUSIONS The use of controlled simulation-based training can be an invaluable tool to improve the knowledge level, dexterity, and confidence of residents performing ultrasound-guided procedures. Additionally, a simulation model allows standardization of education. This study's objective was to evaluate the peritoneal dialysis and mannequin simulator models for the hands-on portion of a 4-h focused abdominal sonography for trauma (FAST) course. After an introductory lecture about trauma sonography and practice on normal models, trainees were assigned randomly to two groups. They practiced FAST on one of the two simulator models. After the didactic and hands-on portions of the seminar, FAST interpretation testing revealed mean scores of 82% and 78% for the peritoneal dialysis and mannequin simulator groups, respectively (p = 0.95). Post-course surveys demonstrated mean satisfaction scores for peritoneal dialysis and mannequin simulator models of 3.85 and 3.25, respectively, on a 4-point Likert scale (p = 0.317). A FAST educational seminar, which provides both didactic and hands-on instruction, can be completed in 4 h; the hands-on instruction phase can incorporate both normal models and abnormal simulation models, such as the peritoneal dialysis model and the multimedia mannequin simulator.
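Several of the training studies above compare group mean test scores with a two-tailed Student's t-test. As an illustration only (the scores below are invented, not taken from any of the studies), the pooled-variance t statistic behind those comparisons can be computed as:

```python
import math

def two_sample_t(a, b):
    """Pooled-variance (Student's) two-sample t statistic."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    # Sample variances with Bessel's correction
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    # Pooled variance across both groups
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    return (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))

# Hypothetical post-test scores for a trained and a control group
trained = [82, 85, 80, 84, 79]
control = [66, 64, 68, 62, 70]
t = two_sample_t(trained, control)  # t ≈ 8.81 with 8 degrees of freedom
```

The two-tailed p-value is then read off the t distribution with na + nb − 2 degrees of freedom; `scipy.stats.ttest_ind` performs the same computation directly.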
11,083
28,510,645
Conclusion: Nurse-led and nurse-collaborative interventions moderately improved adherence among discharged older adults.
Background: Discharged older adult inpatients are often prescribed numerous medications. However, they only take about half of their medications and many stop treatments entirely. Nurse interventions could improve medication adherence among this population. Objective: To conduct a systematic review of trials that assessed the effects of nursing interventions to improve medication adherence among discharged, home-dwelling older adults.
Objective Heart failure patients are regularly admitted to hospital and frequently use multiple medications. Besides intentional changes in pharmacotherapy, unintentional changes may occur during hospitalisation. The aim of this study was to investigate the effect of a clinical pharmacist discharge service on medication discrepancies and prescription errors in patients with heart failure. Setting A general teaching hospital in Tilburg, the Netherlands. Method An open randomized intervention study was performed comparing an intervention group with a control group receiving regular care by doctors and nurses. The clinical pharmacist discharge service consisted of review of discharge medication, communicating prescribing errors with the cardiologist, giving patients information, preparation of a written overview of the discharge medication and communication to both the community pharmacist and the general practitioner about this medication. Within 6 weeks after discharge all patients were routinely scheduled to visit the outpatient clinic and medication discrepancies were measured. Main outcome measure The primary endpoint was the frequency of prescription errors in the discharge medication and medication discrepancies after discharge combined. Results Forty-four patients were included in the control group and 41 in the intervention group. Sixty-eight percent of patients in the control group had at least one discrepancy or prescription error against 39% in the intervention group (RR 0.57 (95% CI 0.37–0.88)). The percentage of medications with a discrepancy or prescription error in the control group was 14.6% and in the intervention group it was 6.1% (RR 0.42 (95% CI 0.27–0.66)).
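The relative risks reported above follow the standard definition, RR = risk in the intervention group / risk in the control group, with a 95% CI computed from the standard error of log(RR). A minimal sketch, using the patient counts implied by this abstract's percentages (39% of 41 intervention patients ≈ 16; 68% of 44 control patients ≈ 30), reproduces the first reported estimate:

```python
import math

def relative_risk(events_i, n_i, events_c, n_c, z=1.96):
    """Relative risk with a 95% CI from the log-RR standard error."""
    rr = (events_i / n_i) / (events_c / n_c)
    se = math.sqrt(1 / events_i - 1 / n_i + 1 / events_c - 1 / n_c)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Patients with at least one discrepancy or prescription error
rr, lo, hi = relative_risk(16, 41, 30, 44)
# rr ≈ 0.57, CI ≈ (0.37, 0.88) — matching the reported RR 0.57 (95% CI 0.37–0.88)
```

The same function applied to per-medication counts would yield the second estimate; the exact counts for that comparison are not given in the abstract.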
Conclusion This clinical pharmacist discharge service significantly reduces the risk of discrepancies and prescription errors in medication of patients with heart failure in the first month after discharge. Abstract Background: Increased life expectancy is associated with an increased prevalence of chronic diseases and drug consumption. Changes often occur in the medication regimen after hospitalization. The extent and nature of these changes and the adherence of elderly patients have not yet been fully investigated. Objective: To investigate the extent and reasons for modifications to the medication regimens of elderly patients and their adherence to treatment during the first month following hospital discharge. Methods: This was a prospective cohort study of 198 patients aged ≥65 years in the Acute Geriatric Ward, Beilinson Hospital, Rabin Medical Center, Israel. Clinical, demographic and medication regimen data were recorded for all patients at an interview conducted prior to discharge. After 1 month, the patient, caregiver or general practitioner (GP) was interviewed regarding the extent and reasons for modifications to the medication regimen and adherence to treatment. Results: At 1 month post-discharge, on average, 36.7% of patient medications had been modified compared with the discharge prescription. No modification was found in 16% of patients. During the observation month, 62% of prescribed long-term medications were taken without modification as recommended at discharge and during follow-up, 50% of all changes were characterized by the addition of a drug or an increase in dosage, and 26%, 16% and 8% consisted of cancelling, omission or switching within the same medication type, respectively.
Seventy percent of medication regimen changes were based on specialists' recommendations or secondary to a change in the patients' medical state, and 13%, 8%, 3% and 6% were as a result of poor adherence, adverse effects, administrative restrictions and other reasons, respectively. There was no correlation between medication regimen change and age, gender, physical function, cognitive function and length of hospital stay. Patients discharged home experienced less regimen modification than those discharged elsewhere (p = 0.02). Patients who visited their GP only once experienced less regimen modification (p = 0.03). Regression analysis showed that the only factors affecting medication regimen changes were GP visits and chronic diseases (p < 0.01, R2 = 0.09). The overall mean adherence among 145 home-dwelling patients was 96.7%. Twenty-seven percent and 6% were under- and over-adherent, respectively, to at least one drug; under-adherence was more widespread than over-adherence. No correlation was found between the overall mean adherence and other clinical parameters or regimen change. However, non-adherence to at least one drug was associated with more medication regimen changes (p = 0.001), was more common in patients discharged with prescriptions for seven or more drug types per day (p = 0.01) and was associated with failing to visit the patient's GP 1 month after discharge (p = 0.02). Conclusion: The majority of elderly patients experienced modifications in their medication regimen during the first month following hospital discharge. Thirty percent of patients were non-adherent to at least one drug.
To improve adherence to a hospital medication regimen, patients should be encouraged to visit their GP and the number of long-term drugs should be reduced. OBJECTIVE To examine the reliability and validity of the Medication Adherence Individual Review-Screening Tool (MedAdhIR-ST) for assessing medication adherence in a community-dwelling elderly population. DESIGN A prospective, observational pilot study comparing the reliability and validity of the MedAdhIR-ST and the Medication Adherence Questionnaire (MAQ). SETTING Independent senior-housing apartments and senior centers in Wake County, North Carolina. PARTICIPANTS Eligible subjects included individuals 60 years of age or older who were living in the community and managing their own medication regimens. INTERVENTIONS Each subject was asked to participate in two assessment visits, two weeks (+/- 3 days) apart, in which the questions of the MedAdhIR-ST and MAQ were administered. MAIN OUTCOME MEASURE Medication adherence. RESULTS Both tools showed moderate-to-high test/retest reliability in the study population (correlation coefficient of 0.632 for MAQ and 0.699 for MedAdhIR-ST), and moderate internal consistency (Cronbach's α of 0.551 and 0.584, respectively). Moderate concordance in the ability to assess adherence was observed between MedAdhIR-ST and MAQ (positive correlation coefficient of 0.450). When compared with refill records, MedAdhIR-ST was slightly more sensitive (67% vs. 43%) and specific (60% vs. 50%) for detecting adherence and nonadherence, respectively, compared with MAQ. Exploratory factor analysis indicated that MedAdhIR-ST is multidimensional. CONCLUSION MedAdhIR-ST appears to be a reliable and valid tool for screening nonadherence in a community-dwelling elderly population. BACKGROUND Despite the availability of proven therapies, outcomes in patients with heart failure (HF) remain poor.
In this 2-stage, multicenter trial, we evaluated the effect of a disease management program on clinical and economic outcomes in patients with HF. METHODS AND RESULTS In Stage 1, a pharmacist or nurse assessed each patient and made recommendations to the physician to add or adjust angiotensin-converting enzyme (ACE) inhibitors and other HF medications. Before discharge (Stage 2), patients were randomized to a patient support program (PSP) (education about HF, self-monitoring, adherence aids, newsletters, telephone hotline, and follow-up at 2 weeks, then monthly for 6 months after discharge) or usual care. In Stage 1 (766 patients) ACE inhibitor use increased from 58% on admission to 83% at discharge (P < .0001), and the daily dose (in enalapril equivalents) increased from 11.3 +/- 8.8 mg to 14.5 +/- 8.8 mg (P < .0001). In Stage 2 (276 patients) there was no difference in ACE inhibitor adherence, but a reduction in cardiovascular-related emergency room visits (49 versus 20, P = .030), hospitalization days (812 versus 341, P = .003), and cost of care (2,531 Canadian dollars less per patient) in favor of the PSP. CONCLUSION Simple interventions can improve ACE inhibitor use and patient outcomes. Abstract Objectives. Many hospital admissions are due to inappropriate medical treatment, and discharge of fragile elderly patients involves a high risk of readmission. The present study aimed to assess whether a follow-up programme undertaken by GPs and district nurses could improve the quality of the medical treatment and reduce the risk of readmission of elderly newly discharged patients. Design and setting. The patients were randomized to either an intervention group receiving a structured home visit by the GP and the district nurse one week after discharge followed by two contacts after three and eight weeks, or to a control group receiving the usual care. Patients.
A total of 331 patients aged 78+ years discharged from Glostrup Hospital, Denmark, were included. Main outcome measures. Readmission rate within 26 weeks after discharge among all randomized patients. Control of medication, evaluated 12 weeks after discharge in 293 (89%) of the patients by an interview at home and by a questionnaire to the GP. Results. Control-group patients were more likely to be readmitted than intervention-group patients (52% vs. 40%; p = 0.03). In the intervention group, the proportions of patients who used prescribed medication of which the GP was unaware (48% vs. 34%; p = 0.02) and who did not take the medication prescribed by the GP (39% vs. 28%; p = 0.05) were smaller than in the control group. Conclusion. The intervention shows a possible framework for securing the follow-up of elderly patients after discharge by reducing the readmission risk and improving medication control. Lowering blood pressure (BP) in stroke survivors reduces the risk of recurrent stroke. We tested the hypothesis that a nurse-led nonpharmacologic intervention would lower the BP of participants in an intervention group compared with a control group. A total of 349 patients who had sustained acute stroke or transient ischemic attack were randomly assigned to either usual care or to 4 home visits by a nurse. During the visits, the nurse measured and recorded BP and provided individually tailored counseling on a healthy lifestyle. A total of 303 patients completed the 1-year follow-up. No change in systolic BP was noted in either the intervention group or the control group. Because of an increase in diastolic BP in the control group (P = .03), a difference in mean diastolic BP between the 2 groups was found at follow-up (P = .007). Mean BP at follow-up was 139/82 mm Hg in the intervention group and 142/86 mm Hg in the control group.
Linear regression analysis demonstrated that BP at the point of discharge was the strongest predictor of BP 1 year later (P < .0001). The proportion of patients on antihypertensive medication increased in the intervention group (P = .002). Patients were compliant with antihypertensive therapy, and 92% of the hypertensive patients in the intervention group followed the advice to see a general practitioner (GP) for BP checkups. At follow-up, 187 patients (62%) were hypertensive, with no difference in the rate of hypertension seen between the groups. Our data indicate that home visits by nurses did not result in a lowering of BP. Patients complied with antihypertensive therapy and GP visits in the case of hypertension. Nonetheless, the majority of patients were hypertensive at the 1-year follow-up UNLABELLED An integrated care intervention including education, coordination among levels of care, and improved accessibility reduced hospital readmissions in chronic obstructive pulmonary disease (COPD) after 1 year. This study analyses the effectiveness of this intervention in terms of clinical and functional status, quality of life, lifestyle, and self-management, under the hypothesis that changes in these factors could explain the observed reduction in readmissions. A total of 113 exacerbated COPD patients (14% female, mean (SD) age 73(8) years, FEV(1) 1.2(0.5) l) were recruited after hospital discharge in Barcelona, Spain, and randomly assigned (1:2) to integrated care (IC) (n=44) or usual care (UC) (n=69). The intervention consisted of an individually tailored care plan at discharge shared with the primary care team and access to a specialized case manager nurse through a web-based call centre. After 1 year of intervention, subjects in the intervention group improved body mass index by 1.34 kg/m(2). Additionally, they scored better in self-management items: COPD knowledge 81% vs.
44%, exacerbation identification 85% vs. 22%, exacerbation early treatment 90% vs. 66%, inhaler adherence 71% vs. 37%, and inhaler correctness 86% vs. 24%. There were no differences in the evolution of dyspnea, lung function, quality of life scores, lifestyle factors, or medical treatment. CONCLUSIONS This IC trial improved disease knowledge and treatment adherence after 1 year of intervention, suggesting that these factors may play a role in the prevention of severe COPD exacerbations triggering hospital admissions OBJECTIVE: To compare medication adherence calculated from four different data sources including a pill count and self-report obtained during a home medication history, as well as calculations based on refill frequency derived from a provincial prescription claims database (manual and electronic). DESIGN: Baseline medication adherence was collected as part of a prospective, randomized, controlled study. Mean medication adherence results obtained from the four data sources were compared using repeated-measures ANOVA followed by a Tukey's multiple range test. SETTING: A pharmacy consultation service located at an interdisciplinary wellness center for noninstitutionalized elderly. PATIENTS: 65 years or older, noninstitutionalized, taking one or more prescribed or nonprescribed medications. Clients would either present to the wellness center or be referred by the Provincial Home Care program. RESULTS: When calculated from self-report or manual or electronic prescription claims data, mean percent adherence by drug was high and not statistically different (95.8% ± 17.1%, 107.6% ± 40.3%, and 94.6% ± 24.0%, respectively), whereas the pill count adherence was significantly lower at 74.0% ± 41.5% (p < 0.0001). CONCLUSIONS: An unexpected finding was that the pill count technique used in this study of elderly clients using chronic, repeat medications appeared to underestimate medication adherence.
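The pill-count and refill-based adherence measures compared in the study above reduce to simple ratios; the sketch below uses hypothetical quantities and illustrative function names, not figures from the study:

```python
def pill_count_adherence(dispensed, remaining, daily_dose, days_elapsed):
    """Pill-count adherence (%): pills presumed taken (dispensed minus
    counted remainder) over pills that should have been taken."""
    taken = dispensed - remaining
    expected = daily_dose * days_elapsed
    return taken / expected * 100

def refill_adherence(days_supplied, days_in_interval):
    """Refill-based adherence (%): days of supply dispensed over days in
    the observation interval; early refills can push this above 100%."""
    return days_supplied / days_in_interval * 100

# Hypothetical 30-day interval, one tablet daily, 8 tablets left at the count
print(round(pill_count_adherence(dispensed=30, remaining=8,
                                 daily_dose=1, days_elapsed=30), 1))  # 73.3
print(round(refill_adherence(days_supplied=32, days_in_interval=30), 1))  # 106.7
```

The asymmetry illustrates the study's point: refill frequency can overstate adherence (107.6% mean in the claims data) relative to a pill count taken in the home.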
Numerous other limitations of pill count, self-report, and a province-wide prescription claims database in estimating medication adherence are presented. When using medication adherence as a process measure, the researcher and practitioner should be aware of the limitations unique to the data source they choose, and interpret data cautiously Abstract The Center for Adherence Support Evaluation (CASE) Adherence Index, a simple composite measure of self-reported antiretroviral therapy (ART) adherence, was compared to a standard three-day self-reported adherence measure among participants in a longitudinal, prospective cross-site evaluation of 12 adherence programs throughout the United States. The CASE Adherence Index, consisting of three unique adherence questions developed for the cross-site study, along with a three-day adherence self-report, was administered by interviews every three months over a one-year period. Data from the three cross-site adherence questions (individually and in combination) were compared to three-day self-report data and HIV RNA and CD4 outcomes in cross-sectional analyses. The CASE Adherence Index correlated strongly with the three-day self-reported adherence data (p<0.001) and was more strongly associated with HIV outcomes, including a 1-log decline in HIV RNA level (maximum OR = 2.34; p<0.05) and HIV RNA < 400 copies/ml (maximum OR = 2.33; p<0.05), and performed as well as the three-day self-report when predicting CD4 count status. Participants with a CASE Index score > 10 achieved a 98-cell mean increase in CD4 count over 12 months, compared to a 41-cell increase for those with scores ≤10 (p<0.05).
The CASE Adherence Index is an easy-to-administer instrument that provides an alternative method for assessing ART adherence in clinical settings Background Many patients delay or interrupt dual antiplatelet therapy (DAT) after drug-eluting stent (DES) implantation, which increases the risk of stent thrombosis and death. Objective To test the hypothesis that simple telephone contact made by nurses would improve adherence to and persistence of DAT. Design Randomised controlled trial. Patients and intervention A total of 300 patients (mean±SD 64±10 years, 73% male) were recruited immediately after DES implantation performed between June 2009 and June 2010. The last patient recruited reached the 1-year follow-up time point in June 2011. Patients were randomised to one of two groups: intervention, with four telephone follow-ups, versus a control group. In the intervention group, phone calls were made within 7 days of the DES implantation and at 1, 6 and 9 months to support drug adherence. Control patients were followed as per usual clinical practice. Pharmacy data were collected to assess drug prescription filling and refill. Setting Tertiary care university cardiovascular centre and community. Main outcome measures The primary end point was the proportion of days covered with aspirin and clopidogrel over the year after discharge as assessed by pharmacy refill data. Secondary outcome measures included persistence of aspirin and clopidogrel treatment, defined as no gaps longer than 14 days during follow-up. Results Most patients (73%) underwent DES implantation in the context of an acute coronary syndrome. All patients had drug insurance cover, either from the public plan (59%) or through private plans (41%). Complete pharmacy follow-up data were available for 96% of the cohort.
At 12 months, median scores (25th–75th centile) for adherence to aspirin and clopidogrel were 99.2% (97.5–100%) and 99.3% (97.5–100%), respectively, in the intervention group compared with 90.2% (84.2–95.4%) and 91.5% (85.1–96.0%), respectively, in the control group (p<0.0001 for aspirin and clopidogrel). Patients in the intervention group were significantly more persistent in the aspirin and clopidogrel treatment than those in the control group. For clopidogrel, 87.2% of patients in the intervention group were still persistent at 12 months compared with only 43.1% in the control group (p<0.0001). Conclusions A simple approach of four telephone calls to patients after DES implantation significantly improved 1-year drug adherence to near-perfect scores. Persistence of DAT was also significantly improved by the intervention Abstract Introduction Congestive heart failure (CHF), which typically affects older people, is characterized by high short- and mid-term mortality rates. However, despite accumulating evidence showing that administration of β-blockers (β-adrenoceptor antagonists) can improve the clinical status of CHF patients, use of these agents in adequate dosages in this setting is not routine. One reason for this appears to be a concern about a possible risk of bradyarrhythmia associated with use of β-blockers. Telecardiology has recently been investigated as a means of constantly monitoring the heart rate of CHF patients in their homes. Its use may allay concerns about the risk of bradyarrhythmia and facilitate a more widespread use of β-blockers in this context.
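The proportion-of-days-covered and 14-day-gap persistence outcomes used in the telephone-contact stent trial above can be computed from pharmacy dispensing records; the fill dates and day supplies below are hypothetical, and the function names are illustrative:

```python
from datetime import date, timedelta

def coverage_days(fills, start, end):
    """Return the set of days in [start, end] covered by any fill.
    Each fill is (dispense_date, days_supply)."""
    covered = set()
    for dispensed, days_supply in fills:
        day = dispensed
        for _ in range(days_supply):
            if start <= day <= end:
                covered.add(day)
            day += timedelta(days=1)
    return covered

def proportion_of_days_covered(fills, start, end):
    """Proportion of days in the observation window with drug on hand."""
    total = (end - start).days + 1
    return len(coverage_days(fills, start, end)) / total

def persistent(fills, start, end, max_gap=14):
    """Persistence as defined in the trial: no uncovered gap longer
    than max_gap days anywhere in the observation window."""
    covered = coverage_days(fills, start, end)
    gap = 0
    day = start
    while day <= end:
        gap = 0 if day in covered else gap + 1
        if gap > max_gap:
            return False
        day += timedelta(days=1)
    return True

# Hypothetical 90-day window with one late refill (20-day gap)
start, end = date(2010, 1, 1), date(2010, 3, 31)
fills = [(date(2010, 1, 1), 30), (date(2010, 2, 20), 40)]
print(round(proportion_of_days_covered(fills, start, end), 3))  # 0.778
print(persistent(fills, start, end))  # False
```

Note how the two measures diverge: the same record scores 78% on coverage but fails the persistence criterion because of a single long gap.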
Objectives The primary objectives of this study were to assess the impact of telemonitoring on patients’ adherence to prescribed therapeutic regimens, particularly β-blockers, and to explore whether use of home telemonitoring reduces mortality and rate of re-admission to hospital in elderly CHF patients compared with normal specialized CHF team care. Methods A total of 57 patients with CHF (31 New York Heart Association [NYHA] class II, 23 NYHA class III and 3 NYHA class IV), with a mean±SD age of 78.2±7.3 years, were randomized to a control group who received standard care, based on routinely scheduled clinic visits, from a team specialized in CHF patient management, or to a home telemonitoring group (TM group), managed by the same specialized CHF team. Patients were followed up over 12 months. Results Compared with the control group, the TM group had a significant increase in the use of β-blockers, HMG-CoA reductase inhibitors (statins) and aldosterone receptor antagonists. A reduction in nitrate administration compared with baseline was also seen in the TM group. The 12-month occurrence of the primary combined endpoint of mortality and hospital re-admission for CHF was significantly lower in the TM group than in the control group (p < 0.01). Conclusions This study showed that a home-care model including telemonitoring of relevant clinical parameters may provide useful support in the management of patients with CHF. Home telemonitoring in CHF patients was associated with increased use of β-blockers at appropriate doses, suggesting that this strategy reassured physicians regarding the safety of careful use of these agents in this setting. However, larger studies are required to confirm these findings. Our findings indicate that there is a need to investigate relevant parameters in CHF patients at the point of care (i.e.
in patients’ daily lives), which can in turn optimize β-blocker and other drug therapy This pilot study examined the impact of a hospital transition intervention for older adults (≥65 years of age) with heart failure (HF) to promote medication use self-management. Forty subjects, hospitalized with either primary or secondary HF, had a mean age of 76.9 ± 6.5 years; 65% were males. The majority of subjects (55%) had NYHA Class III HF. A prospective, repeated-measures experimental design was used. Baseline and follow-up data (1 and 3 months after hospitalization) were obtained using the Medication Regimen Complexity Index, Brief Medication Questionnaire, Drug Regimen Unassisted Grading Scale, and Kansas City Cardiomyopathy Questionnaire. Using repeated-measures analysis of covariance (ANCOVA), with baseline measures as covariates, the transition intervention group had higher levels of medication adherence (F(1,35) = 13.4, p < .001) and self-efficacy for HF self-care (F(1,35) = 17.9, p < .001) and had significantly fewer HF-related symptoms that impaired health-related quality of life (F(1,35) = 9.1, p = .006) PURPOSE The objectives of this investigation were to prospectively assess medication compliance rates in elderly patients with congestive heart failure, to identify factors associated with reduced compliance, and to evaluate the effect of a multidisciplinary treatment approach on medication adherence. PATIENTS AND METHODS A total of 156 patients ≥70 years of age (mean, 79.4 ± 6.0; 67% female, 65% nonwhite) hospitalized with congestive heart failure were evaluated prospectively. Prior to discharge, patients were randomized to the study intervention (n = 80) or conventional care (n = 76). The intervention consisted of comprehensive patient education, dietary and social service consultations, medication review, and intensive postdischarge follow-up.
Detailed data were collected on all prescribed medications at the time of discharge, and compliance was assessed by pill counts 30 ± 2 days later. RESULTS The overall compliance rate during the first 30 days after discharge was 84.6 ± 15.1% (range, 23.1–100%). Compliance was 87.9 ± 12.0% in patients randomized to the study intervention, compared with 81.1 ± 17.2% in the control group (P = 0.003). A compliance rate of ≥80% was achieved by 85.0% of the treatment group versus 69.7% of the control group (P = 0.036). By multivariate analysis, assignment to the treatment group was the strongest independent predictor of compliance (P = 0.008). Other variables included in the model were Caucasian race (P = 0.044) and not living alone (P = 0.09). CONCLUSIONS A multidisciplinary treatment strategy is associated with improved medication compliance during the first 30 days following hospital discharge in elderly patients with congestive heart failure. Improved compliance may contribute to improved outcomes in these patients AIM To test the effects of a postdischarge transitional care programme among patients with coronary heart disease. BACKGROUND Coronary heart disease is a leading cause of death worldwide. Effective postdischarge care can help patients maintain a healthy lifestyle and thereby control the risk factors. Transitional care is under-developed in mainland China. DESIGN A randomised controlled trial. METHOD The control group (n = 100) received routine care and the study group (n = 100) received the postdischarge transitional care programme, which consisted of predischarge assessment, structured home visits and telephone follow-ups within four weeks after discharge. Subjects were recruited in 2002–2003, with data collected at baseline before discharge, two days and four and 12 weeks after discharge.
RESULTS Participants in the study group had significantly better understanding of diet, medications and health-related lifestyle behaviour at day 2 and in weeks 4 and 12, and better understanding of exercise at weeks 4 and 12. There were significant differences between the control and study groups in diet and health-related lifestyle at day 2 and weeks 4 and 12, in medication at weeks 4 and 12, and in exercise at week 12. There was no difference in hospital readmission between the two groups. The study group was very satisfied with the care. There was no difference in willingness to pay for nurse follow-up services between groups. CONCLUSION This study is an original effort to establish and test a nurse-led transitional care model in China. Results demonstrate that transitional care is effective in mainland China, concurring with studies done elsewhere. RELEVANCE TO CLINICAL PRACTICE This study has constructed a transitional care model for patients with coronary heart disease in the context of the Chinese population which is effective in enhancing a healthy lifestyle among these patients
11,084
29,620,566
Conclusions: There is a high variability of osmolarity measurements with the TearLab system. A substantial number of healthy subjects fulfill the DEWS's definition of DED.
Purpose: To assess the variability of osmolarity measured by the point-of-care TearLab system in healthy eyes.
Abstract Purpose: To determine the tear osmolarity in patients with tearing secondary to dry eye and other pathologies, and to determine the prevalence of dry eye disease among patients with tearing in an oculoplastics setting. Methods: 108 eyes of 54 patients with a chief complaint of tearing were prospectively recruited. Subjects were excluded if they used eye drops or contact lenses within 2 hours of assessment, had a history of refractive surgery, an active ocular allergy, or evidence of a systemic disease which affects tear production. A full medical and ocular history was taken with a complete eye exam pertinent to dry eye. Tear osmolarity was measured using the TearLab device. A clinical diagnosis of dry eye was made based on findings, without reference to tear osmolarity. Results: Among 86 eyes symptomatic for tearing, 32 eyes had dry eye disease (37%). Patients with dry eye had a significantly higher median tear osmolarity compared to that in patients with other diagnoses (308 mOsm/L vs. 294 mOsm/L, p < 0.0001). At a cut-off of 308 mOsm/L, tear osmolarity resulted in a sensitivity of 50% and a specificity of 88% for the diagnosis of dry eye. Conclusions: A significant proportion of patients with tearing in an oculoplastics practice had dry eye disease. The high specificity of tear osmolarity may render it a useful tool to rule in dry eye disease and may assist the oculoplastic surgeon in more accurately determining the cause of tearing PURPOSE To evaluate the use of tear osmolarity in the diagnosis of dry eye disease. DESIGN A prospective, observational case series to determine the clinical usefulness of tear osmolarity and commonly used objective tests to diagnose dry eye disease. METHODS A multicenter, 10-site study consisting of 314 consecutive subjects between 18 and 82 years of age.
Bilateral tear osmolarity, tear film break-up time (TBUT), corneal staining, conjunctival staining, Schirmer test, and meibomian gland grading were performed. Diagnostic performance was measured against a composite index of objective measurements that classified subjects as having normal, mild or moderate, or severe dry eye. The main outcome measures were sensitivity, specificity, area under the receiver operating characteristic curve, and intereye variability. RESULTS Of the 6 tests, tear osmolarity was found to have superior diagnostic performance. The most sensitive threshold between normal and mild or moderate subjects was found to be 308 mOsms/L, whereas the most specific was found at 315 mOsms/L. At a cutoff of 312 mOsms/L, tear hyperosmolarity exhibited 73% sensitivity and 92% specificity. By contrast, the other common tests exhibited either poor sensitivity (corneal staining, 54%; conjunctival staining, 60%; meibomian gland grading, 61%) or poor specificity (tear film break-up time, 45%; Schirmer test, 51%). Tear osmolarity also had the highest area under the receiver operating characteristic curve (0.89). Intereye differences in osmolarity were found to correlate with increasing disease severity (r(2) = 0.32). CONCLUSIONS Tear osmolarity is the best single metric both to diagnose and classify dry eye disease. Intereye variability is a characteristic of dry eye not seen in normal subjects PURPOSE A prospective, multisite clinical study (10 sites in the European Union and the United States) evaluated the clinical utility of commonly used tests and tear osmolarity for assessing dry eye disease severity. METHODS Three hundred fourteen consecutive subjects between the ages of 18 and 82 years were recruited from the general patient population, 299 of which qualified with complete data sets.
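The cutoff-based diagnostic performance reported above (e.g., 73% sensitivity and 92% specificity at 312 mOsm/L) comes from a standard 2×2 calculation against a reference diagnosis; a minimal sketch on invented readings, with illustrative names:

```python
def sensitivity_specificity(values, labels, cutoff):
    """Sensitivity/specificity of the rule 'osmolarity >= cutoff means
    dry eye'. labels: True = dry eye, False = normal (reference diagnosis)."""
    tp = sum(1 for v, d in zip(values, labels) if d and v >= cutoff)
    fn = sum(1 for v, d in zip(values, labels) if d and v < cutoff)
    tn = sum(1 for v, d in zip(values, labels) if not d and v < cutoff)
    fp = sum(1 for v, d in zip(values, labels) if not d and v >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical osmolarity readings (mOsm/L) with reference diagnoses
values = [295, 300, 305, 310, 316, 320, 298, 302, 306, 330]
labels = [False, False, False, True, True, True, False, False, True, True]
sens, spec = sensitivity_specificity(values, labels, 312)
print(sens, spec)  # 0.6 1.0
```

Sweeping the cutoff over all observed values and plotting sensitivity against 1 − specificity traces the ROC curve whose area the abstract reports.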
Osmolarity testing, Schirmer test without anesthesia, tear film breakup time (TBUT), corneal staining, meibomian dysfunction assessment, and conjunctival staining were performed bilaterally. A symptom questionnaire, the Ocular Surface Disease Index (OSDI), was also administered to each patient. Distributions of clinical signs and symptoms against a continuous composite severity index were evaluated. RESULTS Osmolarity was found to have the highest correlation coefficient to disease severity (r(2) = 0.55), followed by conjunctival staining (r(2) = 0.47), corneal staining (r(2) = 0.43), OSDI (r(2) = 0.41), meibomian score (r(2) = 0.37), TBUT (r(2) = 0.30), and Schirmer result (r(2) = 0.17). A comparison of standard threshold-based classification with the composite severity index revealed significant overlap between the disease severities of prospectively defined normal and dry eye groups. Fully 63% of the subjects were found to be poorly classified by combinations of clinical thresholds. CONCLUSIONS Tear film osmolarity was found to be the single best marker of disease severity across normal, mild/moderate, and severe categories. Other tests were found to be informative in the more severe forms of disease; thus, clinical judgment remains an important element in the clinical assessment of dry eye severity. The results also indicate that the initiation and progression of dry eye is multifactorial and supports the rationale for redefining severity on the basis of a continuum of clinical signs. (ClinicalTrials.gov number, NCT00848198.) Purpose: To evaluate the tear osmolarity and ocular surface changes in patients with polycystic ovary syndrome (PCOS). Materials and Methods: Forty-eight patients with recently diagnosed PCOS and thirty-three control volunteers were enrolled in this prospective, observational study. Ocular surface disease index (OSDI) score was calculated.
Tear osmolarity was measured using the TearLab Osmolarity System (TearLab, San Diego, CA, USA). All subjects also underwent the following ophthalmologic evaluation: Schirmer I test, tear-film breakup time (TBUT), ocular surface fluorescein staining, and conjunctival impression cytology. Results: Mean OSDI score was significantly higher in patients with PCOS than control subjects (P = 0.001). Tear osmolarity was similar in both groups (P = 0.404). There were no significant differences between groups in Schirmer I test results, TBUT, and ocular surface fluorescein staining scores (P > 0.05). Compared to the control group, a statistically significant squamous metaplasia was observed in temporal bulbar conjunctival impression cytology specimens in the PCOS group (P = 0.032). Conclusions: In patients with recently diagnosed PCOS, tear volume and osmolarity are not affected, but conjunctival morphology may be affected, though on a limited scale UNLABELLED It has been suggested that tear fluid is isotonic with plasma, and plasma osmolality (P(osm)) is an accepted, albeit invasive, hydration marker. Our aim was to determine whether tear fluid osmolarity (T(osm)) assessed using a new, portable, noninvasive, rapid collection and measurement device tracks hydration. PURPOSE This study aimed to compare changes in T(osm) and another widely used noninvasive marker, urine specific gravity (USG), with changes in P(osm) during hypertonic-hypovolemia. METHODS In a randomized order, 14 healthy volunteers exercised in the heat on one occasion with fluid restriction (FR) until 1%, 2%, and 3% body mass loss (BML) and with overnight fluid restriction until 08:00 h the following day, and on another occasion with fluid intake (FI). Volunteers were rehydrated between 08:00 and 11:00 h. T(osm) was assessed using the TearLab osmolarity system. RESULTS P(osm) and USG increased with progressive dehydration on FR (P < 0.001).
T(osm) increased significantly on FR from 293 ± 9 to 305 ± 13 mOsm·L(-1) at 3% BML and remained elevated overnight (304 ± 14 mOsm·L(-1); P < 0.001). P(osm) and T(osm) decreased during exercise on FI and returned to preexercise values the following morning. Rehydration restored P(osm), USG, and T(osm) to within preexercise values. The mean correlation between T(osm) and P(osm) was r = 0.93 and that between USG and P(osm) was r = 0.72. CONCLUSIONS T(osm) increased with dehydration and tracked alterations in P(osm) with comparable utility to USG. Measuring T(osm) using the TearLab osmolarity system may offer sports medicine practitioners, clinicians, and research investigators a practical and rapid hydration assessment technique Abstract Aim: To generate data on the variability of tear osmolarity in a control (normal, non-dry eye) and symptomatic dry eye population (Ocular Surface Disease Index: OSDI ≥20). A secondary outcome is the determination of the effect that tear collection technique has on the osmolarity of the sample. Materials and methods: This was a two-phase study that recruited 20 subjects (n = 10 normal, n = 10 dry eye) to evaluate the influence of time between measurements (Phase I) and 30 subjects (n = 15 normal, n = 15 dry eye) to evaluate the influence of collection technique (Phase II). As part of Phase I, serial tear osmolarity measurements were performed on each eye; four separated by 15 min followed by four separated by 1 min, at each of three visits. Phase II compared the consecutive measurement of four in vivo tear samples to four in vitro measurements on tears collected and dispensed from a glass capillary tube. Results: During Phase I, the dry eye group had a significantly higher maximum osmolarity (334.2 ± 25.6 mOsm/L) compared to the normal group (304.0 ± 8.4 mOsm/L, p = 0.002).
No significant differences were observed whether collections were performed at 15- or 1-min intervals. During Phase II, the in vivo osmolarity was equivalent to in vitro measurements from glass capillary tube samples for both the dry eye group (323.0 ± 16.7 mOsm/L versus 317.7 ± 24.8, p = 0.496) and the normal subjects (301.2 ± 7.2 mOsm/L versus 301.9 ± 16.0 mOsm/L, p = 0.884). Conclusion: Symptomatic dry eye subjects exhibited a significantly higher tear osmolarity and variation over time than observed in normal subjects, reflecting the inherent tear film instability of dry eye disease. There was no change in the distribution of tear osmolarity measurements whether tears were collected in rapid succession or given time to equilibrate, and collection method had no impact on tear osmolarity PURPOSE To independently assess the measurement variability of the TearLab System in a clinical setting of one visit and to estimate the minimum number of measurements required for reliable readings of tear osmolarity. METHODS Ten consecutive osmolarity measurements were taken from both eyes by the same examiner at one visit for fourteen subjects. The Ocular Surface Disease Index symptoms questionnaire and tear film break-up time were also performed. Group average cumulative mean and cumulative coefficient of variation were calculated to assess the TearLab measurement variation. Repeated application of Thompson's tau method was performed to identify the outliers in tear osmolarity readings for each eye. Results from both eyes were analysed separately. RESULTS Up to two randomly occurring outlying values in 10 consecutive measurements were found in 19 out of 28 measured eyes.
No statistically significant differences between the left and right eye were found for the group mean and group standard deviation (paired t-test, p=0.099 and p=0.068, respectively); however, the cumulative coefficient of variation indicated higher measurement group variability in one eye. The estimated cumulative coefficient of variation indicated that a minimum of three consecutive acquisitions is required for the measurement to be reliable. CONCLUSIONS The TearLab Osmolarity System required at least three consecutive measurements to be taken in order to provide clinically reliable tear osmolarity readings. Also, taking the maximum osmolarity value for detecting dry eye disease should be viewed with caution, since outlying readings of tear osmolarity frequently occur Purpose: Tear film hyperosmolarity is recognized as an important pathogenetic factor in dry eye syndrome, but difficulties in its measurement have limited its utility in the recent past. This prospective, nonrandomized, clinical single-center study investigates the osmolarity in tear samples of patients with keratoconjunctivitis sicca compared with healthy controls. Methods: One hundred thirty-three patients [aged 58 years (51–64 years), 86 women and 47 men] with moderate to severe keratoconjunctivitis sicca and 95 controls [aged 52 years (48–61 years), 55 women and 40 men] were enrolled in the trial. Tear samples were collected directly from the inferior lateral tear meniscus. Inclusion criteria were a tear breakup time of less than 5 seconds, a Schirmer test with anesthesia less than 5 mm, and positive symptoms (Ocular Surface Disease Index score > 83). Tear film osmolarity was analyzed by the TearLab osmometer. Results: In our study, patients with moderate to severe keratoconjunctivitis sicca showed a tear film osmolarity of 320 mOsmol/L (301–324 mOsmol/L). The results of the control group were 301 mOsmol/L (298–304 mOsmol/L).
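The cumulative coefficient of variation used in the repeatability study above to justify a minimum of three acquisitions can be sketched as follows; the consecutive readings are invented for illustration:

```python
from statistics import mean, stdev

def cumulative_cv(readings):
    """Cumulative coefficient of variation (%) after 2, 3, ..., n readings:
    CV_k = stdev(first k readings) / mean(first k readings) * 100."""
    return [stdev(readings[:k]) / mean(readings[:k]) * 100
            for k in range(2, len(readings) + 1)]

# Hypothetical consecutive TearLab readings (mOsm/L) for one eye
readings = [308, 296, 301, 303, 299, 305, 300, 302, 298, 304]
cvs = cumulative_cv(readings)
print([round(cv, 2) for cv in cvs])
```

Watching where this sequence levels off is the study's criterion for how many consecutive readings are needed before the running estimate stabilizes.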
Our results revealed a significantly higher tear film osmolarity in patients with moderate to severe keratoconjunctivitis sicca compared with the control group. The sensitivity was 87%, and the specificity was 81%. Conclusions: Our results approved the referent value in moderate to severe dry eye of approximately 316 mOsmol/L, as described in the literature. The results showed a significantly higher tear film osmolarity in patients with severe keratoconjunctivitis sicca compared with the healthy controls. Testing tear film osmolarity can be a very effective objective diagnostic tool in the diagnosis of dry eye disease Purpose To compare tear film osmolarity measurements between in situ and vapor pressure osmometers. Repeatability of in situ measurements and the effect of sample collection techniques on tear film osmolarity were also evaluated. Methods Osmolarity was measured in one randomly determined eye of 52 healthy participants using the in situ (TearLab Corporation, San Diego, CA) and the vapor pressure (Vapro 5520; Wescor, Inc., Logan, UT) osmometers. In a subset of 20 participants, tear osmolarity was measured twice on-eye with the in situ osmometer and was additionally determined on a sample of nonstimulated collected tears (3 μL) with both instruments. Results Mean (SD) tear film osmolarity with the in situ osmometer was 299.2 (10.3) mOsmol/L compared with 298.4 (10) mmol/kg with the vapor pressure osmometer, which correlated moderately (r = 0.5, P < 0.05). Limits of agreement between the two instruments were −19.7 to +20.5 mOsmol/L. Using collected tears, measurements with the vapor pressure osmometer were marginally higher (mean [SD], 303.0 [11.0] vs 299.3 [8.0] mOsmol/L; P > 0.05) but correlated well with those using the in situ osmometer (r = 0.9, P < 0.05).
The mean (SD) osmolarity of on-eye tears was 5.0 (6.6) mOsmol/L higher than that of collected tears when both measurements were conducted with the in situ osmometer. This was a consistent effect because the measurements correlated well (r = 0.65, P < 0.05). The in situ osmometer showed good repeatability, with a coefficient of repeatability of 9.4 mOsmol/L (r = 0.8, P < 0.05). Conclusions Correlation between the two instruments was better when compared on collected tear samples. Tear film osmolarity measurement is influenced by the sample collection technique, with the osmolarity of on-eye tears being higher than that of collected tears. This highlights the importance of measuring tear film osmolarity directly on-eye. The in situ osmometer has good repeatability for conducting this measurement PURPOSE: To evaluate the effects of laser in situ keratomileusis (LASIK) and laser-assisted subepithelial keratectomy (LASEK) on dry-eye disease markers including tear osmolarity, Schirmer testing, and the ocular surface disease index (OSDI). SETTING: Laser Suite, Mater Private Hospital, Dublin, Ireland. DESIGN: Prospective controlled cross-sectional observation study. METHODS: In a single center, consecutive eyes that had LASIK or LASEK had dry-eye disease markers assessed preoperatively and 3, 6, and 12 months postoperatively. RESULTS: In LASIK eyes (n = 50), the mean tear osmolarity was significantly elevated (by 2.8%) at 12 months (P=.009, Student t test). The mean Schirmer test values were not significantly altered postoperatively in either group. The mean OSDI was maximum in LASIK eyes at 3 months (mean follow-up 7.2 ± 8.2 [SD] months) and in LASEK eyes (n = 35) preoperatively (mean follow-up 9.1 ± 9.7 months). CONCLUSIONS: There were no significant differences in dry-eye disease markers or tear osmolarity between LASIK and LASEK patients at any stage after surgery up to 1 year.
Although tear osmolarity remains one of the highest predictive tests of dry eye, it is complementary with other diagnostic criteria in the context of dry eye after refractive surgery. Financial Disclosure: No author has a financial or proprietary interest in any material or method mentioned Purpose To assess the interchangeability of tear osmolarity measurements between electrical impedance and freezing-point depression osmometers and to analyze inter-eye tear osmolarity variability measured with these osmometers in healthy subjects. Methods Tear osmolarity was measured using the TearLab osmometer (OcuSense Inc., San Diego, CA) and the Fiske 210 micro sample osmometer (Advanced Instruments Inc., Norwood, MA). We randomly selected one eye in 50 subjects (29 women, 21 men; mean age, 33.16 ± 6.11 years) to analyze whether osmolarity measurements by these osmometers were interchangeable. Both eyes of 25 patients (15 women, 10 men; mean age, 34.32 ± 6.37 years) were included to analyze inter-eye osmolarity variability. Results The mean tear osmolarity values measured with the TearLab osmometer were higher (305.22 ± 16.06 mOsm/L) than those with the Fiske 210 osmometer (293.40 ± 12.22 mOsm/L), with the intraclass correlation coefficient being 0.23 (p = 0.051). A Bland-Altman plot showed that the systems were not interchangeable because there was a systematic difference, with the limits of agreement being −17.93 to 41.57 mOsm/L. There were no statistically significant differences (p = 0.5006 and p = 0.6533, respectively) between an individual's eyes measured with either osmometer. Conclusions Because the TearLab tear osmolarity measurements were higher than those of the Fiske 210 measurements and the limits of agreement were too wide, the two osmolarity values cannot be used interchangeably.
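The Bland-Altman limits of agreement quoted above (e.g. −17.93 to 41.57 mOsm/L) are conventionally the mean paired difference ± 1.96 standard deviations of the differences. A minimal sketch with hypothetical paired readings (the numbers below are illustrative, not trial data):

```python
# Bland-Altman limits of agreement: bias +/- 1.96 SD of the paired differences.
from statistics import mean, stdev

def limits_of_agreement(a, b):
    """Return (bias, lower limit, upper limit) for paired measurements a and b."""
    diffs = [x - y for x, y in zip(a, b)]
    bias, sd = mean(diffs), stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired osmolarity readings from two instruments (mOsm/L).
tearlab = [305, 310, 298, 312, 301]
fiske = [293, 296, 290, 299, 288]
bias, lo, hi = limits_of_agreement(tearlab, fiske)
```

Wide limits relative to the clinically tolerable difference are what leads to the "not interchangeable" verdict, even when the bias itself is modest.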
In healthy subjects, there is no difference in tear osmolarity between right and left eyes of the same individual measured with both instruments Abstract Purpose: Since acromegaly is a disease with various systemic complications, it may also have ophthalmologic consequences. The aim of the current study was to compare the tear osmolarity and tear function changes in patients with acromegaly with those in healthy controls. Design: Prospective, cross-sectional study. Materials and methods: Fifty-nine consecutive patients with acromegaly and 62 age- and gender-matched healthy volunteers were enrolled in the study. Tear osmolarity measurement with the TearLab Osmolarity System (TearLab, San Diego, CA), tear film break-up time (TBUT) assessment, and the Schirmer test without anesthesia were performed in the same order in each group. Growth hormone (GH) and insulin-like growth factor-1 (IGF-1) levels were also determined in the study group. Results: The mean TBUT was lower in acromegalic patients (9.1 ± 3.6 seconds) than in healthy controls (10.7 ± 2.9 s) (p = 0.009). The difference between the two groups in tear osmolarity and Schirmer test results (p = 0.08 and p = 0.9, respectively) was not statistically significant. Conclusions: Acromegaly may cause a decrease in TBUT in the affected patient. Preservation of normal tear osmolarity and normal Schirmer test results suggests that this might be due to effects on the meibomian glands
11,085
29,445,272
There was marked heterogeneity between studies, secondary to the large variety of disease severity within commonly included cohorts and differences in CT acquisition parameters. Conclusion CT density shows a good relationship to clinically relevant parameters; however, study heterogeneity and lack of longitudinal data mean that it is difficult to compare studies or derive a minimal clinically important difference.
Background The aim of the study was to assess the relationship between computed tomography (CT) densitometry and routine clinical markers in patients with chronic obstructive pulmonary disease (COPD) and alpha-1 antitrypsin deficiency (AATD).
Background Computed tomography (CT) lung densitometry has been demonstrated to be the most sensitive and specific outcome measure for the assessment of emphysema-modifying therapy, but the optimum densitometric index has yet to be determined, and targeted sampling may be more sensitive than whole lung assessment. The EXAcerbations and CT scan as Lung Endpoints (EXACTLE) trial aimed to clarify the optimum approach to the use of CT densitometry data for the assessment of alpha 1-antitrypsin (AAT) augmentation therapy on the progression of emphysema in AAT deficiency (AATD). Methods Patients with AATD (n = 77) were randomised to weekly infusions of 60 mg/kg human AAT (Prolastin®) or placebo over 2 to 2.5 years. Lung volume was included as a covariate in an endpoint analysis, and a comparison was made of different CT densitometric indices (15th percentile lung density [PD15], mean lung density [MLD] and voxel index at a threshold of −910 [VI-910] and −950 [VI-950] Hounsfield Units) obtained from whole lung scans at baseline and at 24 to 30 months. Targeted regional sampling was compared with whole lung assessment. Results Whole lung analysis of the total change (baseline to last CT scan) compared with placebo indicated a concordant trend that was suggestive of a treatment effect for all densitometric indices (MLD [1.402 g/L, p = 0.204]; VI-910 [−0.611, p = 0.389]; VI-950 [−0.432, p = 0.452]) and that was significant using PD15 (1.472 g/L, p = 0.049). Assessment of the progression of emphysema in the apical, middle and basal regions of the lung by measurement with PD15 showed that this treatment effect was more evident when the basal third was sampled (1.722 g/L, p = 0.040).
A comparison between different densitometric indices indicated that the influence of inspiratory variability between scans was greatest for PD15, but when adjustment for lung volume was made this index was the most sensitive measure of emphysema progression. Conclusion PD15 is the most sensitive index of emphysema progression and of treatment modification. Targeted sampling may be more sensitive than whole lung analysis. Trial registration Registered in ClinicalTrials.gov as 'Antitrypsin (AAT) to Treat Emphysema in AAT-Deficient Patients'; ClinicalTrials.gov Identifier: NCT00263887 Abstract The Dutch-Belgian Randomized Lung Cancer Screening Trial (Dutch acronym: NELSON study) was designed to investigate whether screening for lung cancer by low-dose multidetector computed tomography (CT) in high-risk subjects will lead to a decrease in 10-year lung cancer mortality of at least 25% compared with a control group without screening. Since the start of the NELSON study in 2003, 7557 participants underwent CT screening, with scan rounds in years 1, 2, 4 and 6. In the current review, the design of the NELSON study including participant selection and the lung nodule management protocol, as well as results on validation of CT screening and first results on lung cancer screening are described PURPOSE To prospectively evaluate airway wall thickness and lung attenuation at spirometrically gated thin-section computed tomography (CT) in patients with chronic obstructive pulmonary disease (COPD) and to correlate gated CT findings with pulmonary function test (PFT) results. MATERIALS AND METHODS The ethical committee approved the study, and all patients gave informed consent. Forty-two consecutive patients with COPD (20 with and 22 without chronic bronchitis [CB]) underwent gated thin-section CT and PFTs on the same day.
The percentage wall area (PWA) and the thickness-to-diameter ratio (TDR) for all depicted bronchi that were round and larger than 2 mm in diameter, the mean lung attenuation (MLA), and the pixel index (PI) at −950 HU were determined. The reproducibility of the airway measurements was preliminarily tested by performing a five-trial examination in a patient with COPD and in a control patient. Differences in airway and lung attenuation measurements between the patients with and those without CB were evaluated at Mann-Whitney U testing. Simple and multiple regression analyses were used to assess the correlation between thin-section CT and PFT measurements. RESULTS The mean intraoperator coefficient of variation for airway measurements was 7.8% (range, 3.8%–13.4%). An average of nine bronchi per patient were assessed. Patients with CB had significantly higher PWAs, TDRs, and MLAs and significantly lower PIs than patients without CB (P < .05 for all values). The combination of PWA, TDR, and PWA normalized to body weight correlated significantly (P < .05) with the forced expiratory volume in 1 second-to-slow vital capacity ratio and the diffusing capacity of the lung for carbon monoxide in patients with but not in patients without CB. PFT results correlated better with MLA and PI in patients without CB. CONCLUSION Bronchial wall measurements differ between patients who have COPD with CB and those who have COPD without CB. The correlation between airway dimensions and indexes of airway obstruction in patients with COPD and CB indicates that the bronchial tree is the site of anatomic-functional alterations in this patient group Background The Korean Obstructive Lung Disease (KOLD) Cohort Study is a prospective longitudinal study of patients with chronic obstructive pulmonary disease (COPD), asthma, or other unclassified obstructive lung diseases.
It was designed to develop new classification models and biomarkers that predict clinically relevant outcomes for patients with obstructive lung diseases. Methods Patients over 18 years old who have chronic respiratory symptoms and airflow limitations or bronchial hyper-responsiveness were enrolled at 17 centers in South Korea. After a baseline visit, the subjects were followed up every 3 months for various assessments. Results From June 2005 to October 2013, a total of 477 subjects (433 [91%] males; 381 [80%] diagnosed with COPD) were enrolled. Analyses of the KOLD Cohort Study identified distinct phenotypes in patients with COPD, and predictors of therapeutic responses and exacerbations as well as the factors related to pulmonary hypertension in COPD. In addition, several genotypes were associated with radiological phenotypes and therapeutic responses among Korean COPD patients. Conclusion The KOLD Cohort Study is one of the leading long-term prospective longitudinal studies investigating heterogeneity of COPD and is expected to provide new insights into the pathogenesis and long-term progression of COPD Background Emphysema on CT is common in older smokers. We hypothesised that emphysema on CT predicts acute episodes of care for chronic lower respiratory disease among older smokers. Materials and Methods Participants in a lung cancer screening study aged ≥60 years were recruited into a prospective cohort study in 2001–02. Two radiologists independently visually assessed the severity of emphysema as absent, mild, moderate or severe. Percent emphysema was defined as the proportion of voxels ≤ −910 Hounsfield Units. Participants completed a median of 5 visits over a median of 6 years of follow-up. The primary outcome was hospitalization, emergency room or urgent office visit for chronic lower respiratory disease. Spirometry was performed following ATS/ERS guidelines.
Airflow obstruction was defined as FEV1/FVC ratio < 0.70 and FEV1 < 80% predicted. Results Of 521 participants, 4% had moderate or severe emphysema, which was associated with acute episodes of care (rate ratio 1.89; 95% CI: 1.01–3.52) adjusting for age, sex and race/ethnicity, as was percent emphysema, with similar associations for hospitalisation. Emphysema on visual assessment also predicted incident airflow obstruction (HR 5.14; 95% CI 2.19–21.1). Conclusion Visually assessed emphysema and percent emphysema on CT predicted acute episodes of care for chronic lower respiratory disease, with the former predicting incident airflow obstruction among older smokers Background Diary cards are useful for analyzing exacerbations in chronic obstructive pulmonary disease (COPD), although factors influencing the length and frequency of each episode are poorly understood. This study investigated factors that influence the features of exacerbations in patients with alpha-1 antitrypsin (AAT) deficiency (PiZ phenotype) and COPD. Methods Daily diary cards were collected over 2 years. Patients had emphysema visualized and quantified by computed tomography scan, and had at least one documented exacerbation in the previous year. Results The patients (n = 23) had a mean age of 52.5 years, forced expiratory volume in one second (FEV1) of 1.2 L (38.4% predicted), corrected gas transfer (KCO) of 0.90 mmol/min/kPa/L (59.7% predicted), and 15th percentile lung density of 44.55 g/L. Two hundred and sixty-three exacerbations (164 treated) were identified. The frequency of treated exacerbations correlated negatively with KCO% predicted (r = −0.432; P = 0.022).
Exacerbation length (determined for 17 of the patients for whom diary card data through the episode were available) correlated negatively with baseline 15th percentile lung density (r = −0.361; P = 0.003), and increased the longer treatment was delayed (r = 0.503; P < 0.001). Treatment delay was shorter with higher day 1 symptom score, lower baseline FEV1, FEV1/forced vital capacity, and lower 15th percentile lung density (r = −0.368, 0.272, 0.461, and 0.786; P = 0.004, 0.036, < 0.001, and < 0.001, respectively). Time to resolution of exacerbation after treatment initiation was not affected by treatment delay, but correlated negatively with KCO% predicted (r = −0.647; P = 0.007). Conclusion In alpha-1 antitrypsin deficiency, the frequency and length of resolution of exacerbation were related to baseline gas transfer. Treatment delay adversely affected exacerbation length, and lung density was the best independent predictor of delay in starting treatment Chronic obstructive pulmonary disease (COPD) is a heterogeneous disease, and responses to therapies are highly variable. The aim of this study was to identify the predictors of pulmonary function response to 3 months of treatment with salmeterol/fluticasone in patients with COPD. A total of 127 patients with stable COPD from the Korean Obstructive Lung Disease (KOLD) Cohort, which were prospectively recruited from June 2005 to September 2009, were analyzed retrospectively. The prediction models for the FEV1, FVC and IC/TLC changes after 3 months of treatment with salmeterol/fluticasone were constructed by using multiple, stepwise, linear regression analysis. The prediction model for the FEV1 change after 3 months of treatment included wheezing history, pre-bronchodilator FEV1, post-bronchodilator FEV1 change and emphysema extent on CT (R = 0.578).
The prediction models for the FVC change after 3 months of treatment included pre-bronchodilator FVC and post-bronchodilator FVC change (R = 0.533), and those for the IC/TLC change after 3 months of treatment included pre-bronchodilator IC/TLC and post-bronchodilator FEV1 change (R = 0.401). Wheezing history, pre-bronchodilator pulmonary function, bronchodilator responsiveness, and emphysema extent may be used for predicting the pulmonary function response to 3 months of treatment with salmeterol/fluticasone in patients with COPD BACKGROUND Retinoids promote alveolar septation in the developing lung and stimulate alveolar repair in some animal models of emphysema. METHODS One hundred forty-eight subjects with moderate-to-severe COPD and a primary component of emphysema, defined by diffusing capacity of the lung for carbon monoxide (Dlco) [37.1 ± 12.0% of predicted] and CT density mask (38.5 ± 12.8% of voxels < −910 Hounsfield units) [mean ± SD], were enrolled into a randomized, double-blind, feasibility study at five university hospitals. Participants received all-trans retinoic acid (ATRA) at either a low dose (LD) [1 mg/kg/d] or high dose (HD) [2 mg/kg/d], 13-cis retinoic acid (13-cRA) [1 mg/kg/d], or placebo for 6 months, followed by a 3-month crossover period. RESULTS No treatment was associated with an overall improvement in pulmonary function, CT density mask score, or health-related quality of life (QOL) at the end of 6 months. However, time-dependent changes in Dlco (initial decrease with delayed recovery) and the St. George Respiratory Questionnaire (delayed improvement) were observed in the HD-ATRA cohort and correlated with plasma drug levels. In addition, 5 of 25 participants in the HD-ATRA group had delayed improvements in their CT scores that also related to ATRA levels. Retinoid-related side effects were common but generally mild.
CONCLUSIONS No definitive clinical benefits related to the administration of retinoids were observed in this feasibility study. However, time- and dose-dependent changes in Dlco, CT density mask score, and health-related QOL were observed in subjects treated with ATRA, suggesting the possibility of exposure-related biological activity that warrants further investigation PURPOSE Limited data exist describing risk factors for mortality in patients having predominantly emphysema. SUBJECTS AND METHODS A total of 609 patients with severe emphysema (ages 40–83 yr; 64.2% male) randomized to the medical therapy arm of the National Emphysema Treatment Trial formed the study group. Cox proportional hazards regression analysis was used to investigate risk factors for all-cause mortality. Risk factors examined included demographics, body mass index, physiologic data, quality of life, dyspnea, oxygen utilization, hemoglobin, smoking history, quantitative emphysema markers on computed tomography, and a modification of a recently described multifunctional index (modified BODE). RESULTS Overall, high mortality was seen in this cohort (12.7 deaths per 100 person-years; 292 total deaths). In multivariate analyses, increasing age (p = 0.001), oxygen utilization (p = 0.04), lower total lung capacity % predicted (p = 0.05), higher residual volume % predicted (p = 0.04), lower maximal cardiopulmonary exercise testing workload (p = 0.002), greater proportion of emphysema in the lower lung zone versus the upper lung zone (p = 0.005), lower upper-to-lower-lung perfusion ratio (p = 0.007), and modified BODE (p = 0.02) were predictive of mortality. FEV1 was a significant predictor of mortality in univariate analysis (p = 0.005), but not in multivariate analysis (p = 0.21).
CONCLUSION Although patients with advanced emphysema experience significant mortality, subgroups based on age, oxygen utilization, physiologic measures, exercise capacity, and emphysema distribution identify those at increased risk of death We have investigated whether restoration of the balance between neutrophil elastase and its inhibitor, alpha(1)-antitrypsin, can prevent the progression of pulmonary emphysema in patients with alpha(1)-antitrypsin deficiency. Twenty-six Danish and 30 Dutch ex-smokers with alpha(1)-antitrypsin deficiency of PI*ZZ phenotype and moderate emphysema (FEV1 between 30% and 80% of predicted) participated in a double-blind trial of alpha(1)-antitrypsin augmentation therapy. The patients were randomized to either alpha(1)-antitrypsin (250 mg/kg) or albumin (625 mg/kg) infusions at 4-wk intervals for at least 3 yr. Self-administered spirometry performed every morning and evening at home showed no significant difference in decline of FEV1 between treatment and placebo. Each year, the degree of emphysema was quantified by the 15th percentile point of the lung density histogram derived from computed tomography (CT). The loss of lung tissue measured by CT (mean ± SEM) was 2.6 ± 0.41 g/L/yr for placebo as compared with 1.5 ± 0.41 g/L/yr for alpha(1)-antitrypsin infusion (p = 0.07). Power analysis showed that this protective effect would be significant in a similar trial with 130 patients. This is in contrast to calculations based on annual decline of FEV1 showing that 550 patients would be needed to show a 50% reduction of annual decline. We conclude that lung density measurements by CT may facilitate future randomized clinical trials of investigational drugs for a disease in which little progress in therapy has been made in the past 30 yr Assessment of emphysema-modifying therapy is difficult, but newer outcome measures offer advantages over traditional methods.
The EXAcerbations and Computed Tomography scan as Lung End-points (EXACTLE) trial explored the use of computed tomography (CT) densitometry and exacerbations for the assessment of the therapeutic effect of augmentation therapy in subjects with α1-antitrypsin (α1-AT) deficiency. In total, 77 subjects (protease inhibitor type Z) were randomised to weekly infusions of 60 mg·kg−1 human α1-AT (Prolastin®) or placebo for 2–2.5 yrs. The primary end-point was change in CT lung density, and an exploratory approach was adopted to identify optimal methodology, including two methods of adjustment for lung volume variability and two statistical approaches. Other end-points were exacerbations, health status and physiological indices. CT was more sensitive than other measures of emphysema progression, and the changes in CT and forced expiratory volume in 1 s were correlated. All methods of densitometric analysis concordantly showed a trend suggestive of treatment benefit (p-values for Prolastin® versus placebo ranged 0.049–0.084). Exacerbation frequency was unaltered by treatment, but a reduction in exacerbation severity was observed. In patients with α1-AT deficiency, CT is a more sensitive outcome measure of emphysema-modifying therapy than physiology and health status, and demonstrates a trend of treatment benefit from α1-AT augmentation BACKGROUND The efficacy of α1 proteinase inhibitor (A1PI) augmentation treatment for α1 antitrypsin deficiency has not been substantiated by a randomised, placebo-controlled trial. CT-measured lung density is a more sensitive measure of disease progression in α1 antitrypsin deficiency emphysema than spirometry is, so we aimed to assess the efficacy of augmentation treatment with this measure. METHODS The RAPID study was a multicentre, double-blind, randomised, parallel-group, placebo-controlled trial of A1PI treatment in patients with α1 antitrypsin deficiency.
We recruited eligible non-smokers (aged 18–65 years) in 28 international study centres in 13 countries if they had severe α1 antitrypsin deficiency (serum concentration < 11 μM) with a forced expiratory volume in 1 s of 35–70% (predicted). We excluded patients if they had undergone, or were on the waiting list to undergo, lung transplantation, lobectomy, or lung volume-reduction surgery, or had selective IgA deficiency. We randomly assigned patients (1:1; done by Accovion) using a computerised pseudorandom number generator (block size of four) with centre stratification to receive A1PI intravenously 60 mg/kg per week or placebo for 24 months. All patients and study investigators (including those assessing outcomes) were unaware of treatment allocation throughout the study. Primary endpoints were CT lung density at total lung capacity (TLC) and functional residual capacity (FRC) combined, and the two separately, at 0, 3, 12, 21, and 24 months, analysed by modified intention to treat (patients needed at least one evaluable lung density measurement). This study is registered with ClinicalTrials.gov, number NCT00261833. A 2-year open-label extension study was also completed (NCT00670007). FINDINGS Between March 1, 2006, and Nov 3, 2010, we randomly allocated 93 (52%) patients A1PI and 87 (48%) placebo, analysing 92 in the A1PI group and 85 in the placebo group. The annual rate of lung density loss at TLC and FRC combined did not differ between groups (A1PI −1.50 g/L per year [SE 0.22]; placebo −2.12 g/L per year [0.24]; difference 0.62 g/L per year [95% CI −0.02 to 1.26], p = 0.06).
However, the annual rate of lung density loss at TLC alone was significantly less in patients in the A1PI group (−1.45 g/L per year [SE 0.23]) than in the placebo group (−2.19 g/L per year [0.25]; difference 0.74 g/L per year [95% CI 0.06–1.42], p = 0.03), but was not at FRC alone (A1PI −1.54 g/L per year [0.24]; placebo −2.02 g/L per year [0.26]; difference 0.48 g/L per year [−0.22 to 1.18], p = 0.18). Treatment-emergent adverse events were similar between groups, with 1298 occurring in 92 (99%) patients in the A1PI group and 1068 occurring in 86 (99%) in the placebo group. 71 severe treatment-emergent adverse events occurred in 25 (27%) patients in the A1PI group and 58 occurred in 27 (31%) in the placebo group. One treatment-emergent adverse event leading to withdrawal from the study occurred in one patient (1%) in the A1PI group and ten occurred in four (5%) in the placebo group. One death occurred in the A1PI group (respiratory failure) and three occurred in the placebo group (sepsis, pneumonia, and metastatic breast cancer). INTERPRETATION Measurement of lung density with CT at TLC alone provides evidence that purified A1PI augmentation slows progression of emphysema, a finding that could not be substantiated by lung density measurement at FRC alone or by the two measurements combined. These findings should prompt consideration of augmentation treatment to preserve lung parenchyma in individuals with emphysema secondary to severe α1 antitrypsin deficiency. FUNDING CSL Behring Purpose High-resolution computed tomography (CT) is a validated method to quantify the extent of pulmonary emphysema. In this study, we assessed the reliability of low-dose volumetric CT (LDCT) for the quantification of emphysema and its correlation with spirometric indices of airway obstruction.
Materials and Methods The study population consisted of 102 consecutive current and former smokers participating in a lung cancer screening trial. All subjects underwent spirometry testing and LDCT at entry and a LDCT after 12 months. The extent of emphysema was estimated by 2 techniques: by using the lung attenuation threshold analysis and by visual assessment of the 2 independent radiologists. The reproducibility of these determinations was assessed using test-retest reliability and the κ coefficient of agreement. The correlation of LDCT-based emphysema determinations with indices of airway obstruction on spirometry was also calculated. Results Eighty percent of the participants were male, with a mean (standard deviation) age of 54.5 (7.5) years, and median pack-years (interquartile range) of 20 (24). Test-retest reliability of all LDCT-based emphysema determinations was very good (intraclass correlation coefficient of 0.92 for the volume of emphysema, and 0.93 for the emphysema index or emphysema volume/total lung volume). Similarly, there was an excellent interrater agreement for visual assessment of emphysema (κ coefficient = 0.91). Higher volumes of emphysema measured quantitatively or visually significantly correlated with spirometric markers of airway obstruction. Conclusions Volumetric LDCT is a reliable and valid technique for the quantification of emphysema in asymptomatic smokers This study was performed to assess density resolution in quantitative computed tomography (CT) of foam and lung. Density resolution, a measure for the ability to discriminate materials of different density in a CT number histogram, is normally determined by quantum noise. In a cellular solid, variations in mass in the volumes sampled by CT cause an additional degradation of density resolution by the linear partial volume effect.
The sample volume, which is directly related to spatial resolution, can be varied by choosing different section thicknesses and reconstruction filters. Several polyethene (PE) foams, as simple models of lung tissue, and five patients were investigated using various sample volumes. For the uniform PE foams, density resolution could be directly determined as the full width at half maximum of CT number histograms. Density resolution for foams with cell sizes of 0.8–1.5 mm was dominated by effects caused by the limited sample size, not by quantum noise. The relative magnitudes of density resolution could roughly be explained with a model for a hypothetic random cellular solid. Since lungs are not of uniform density, analysis of patient data was more complicated. A combined convolution least-squares fit procedure, together with information obtained in the studies of foam, was used to determine density resolution in lung studies. Density resolution, both for foams and lung, was strongly dependent on sample volume, and was quite poor for thin sections and sharp filters. Consequently, histogram-shape related parameters are sensitive to the spatial resolution chosen on CT. Thin section densitometry, using a 1-mm section with a standard or high resolution filter, is not recommended except in determining average density. When using thicker sections, an in-plane spatial resolution similar to section thickness is advised RATIONALE Chronic obstructive pulmonary disease (COPD) is a complex and heterogeneous disorder in which a number of different pathological processes lead to recognition of patient subgroups that may have individual characteristics and distinct responses to treatment. OBJECTIVES We tested the hypothesis that responses of lung function to 3 months of combined inhalation of long-acting beta-agonist and corticosteroid might differ among patients with various COPD subtypes.
METHODS We classified 165 COPD patients into four subtypes according to the severity of emphysema and airflow obstruction: emphysema-dominant, obstruction-dominant, mild-mixed, and severe-mixed. The emphysema-dominant subtype was defined by an emphysema index on computed tomography of more than 20% and FEV1 more than 45% of the predicted value. The obstruction-dominant subtype had an emphysema index ≤ 20% and FEV1 ≤ 45%, the mild-mixed subtype had an emphysema index ≤ 20% and FEV1 > 45%, and the severe-mixed subtype had an emphysema index > 20% and FEV1 ≤ 45%. Patients were recruited prospectively and treated with 3 months of combined inhalation of long-acting beta-agonist and corticosteroid. RESULTS After 3 months of combined inhalation of long-acting beta-agonist and corticosteroid, obstruction-dominant subtype patients showed a greater FEV1 increase and more marked dyspnea improvement than did the emphysema-dominant subgroup. The mixed-subtype patients (both subgroups) also showed significant improvement in FEV1 compared with the emphysema-dominant subgroup. Emphysema-dominant subtype patients showed no improvement in FEV1 or dyspnea after the 3-month treatment period. CONCLUSION The responses to 3 months of combined inhalation of long-acting beta-agonist and corticosteroid differed according to COPD subtype Background: There is increasing recognition that questionnaires of health status and lung density measurements are more sensitive tools for assessing progression of emphysema than forced expiratory volume in 1 second (FEV1) and transfer coefficient (KCO). A study was undertaken to investigate prospectively the correlation between annual change in health status and computed tomography (CT) derived lung density in subjects with α1-antitrypsin deficiency.
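The four-way subtype rule described in the METHODS can be written as a small decision function over the two measurements. The thresholds follow the abstract; the function name and return labels are our own shorthand:

```python
# Subtype rule from the abstract: emphysema index (% of lung on CT) crossed
# with FEV1 (% of predicted), cut at 20% and 45% respectively.

def copd_subtype(emphysema_index_pct, fev1_pct_predicted):
    emphysema_dominant = emphysema_index_pct > 20
    severe_obstruction = fev1_pct_predicted <= 45
    if emphysema_dominant and not severe_obstruction:
        return "emphysema-dominant"
    if not emphysema_dominant and severe_obstruction:
        return "obstruction-dominant"
    if not emphysema_dominant and not severe_obstruction:
        return "mild-mixed"
    return "severe-mixed"
```

For example, a patient with an emphysema index of 25% and FEV1 of 60% predicted falls in the emphysema-dominant group, while 10% and 40% would be obstruction-dominant.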
Methods: Twenty-two patients of mean (SD) age 40.7 (9.2) years with ZZ type α1-antitrypsin deficiency were investigated at baseline and 30 months later by FEV1 and Kco, St George Respiratory Questionnaire (SGRQ), and by a spiral CT scan of the chest. CT data of chest images were analysed using software designed for automated lung contour detection and lung density measurements. The density data were corrected for changes in inspiration levels. Results: Changes in lung density, expressed as 15th percentile point or relative area below −950 HU, correlated well with changes in health status (SGRQ total score): R = −0.56, p = 0.007 or R = 0.6, p = 0.003. Neither changes in health status nor changes in lung density correlated significantly with changes in FEV1 or changes in Kco. Conclusions: The SGRQ total score (which is a global measure in COPD) and lung density (a specific measure of emphysema) are sensitive to deterioration in patients with α1-antitrypsin deficiency. This finding may facilitate future studies with new drugs specific for emphysema, a frequently occurring component of COPD The objective was to evaluate the effect of inhaled corticosteroids on disease progression in smokers with moderate to severe chronic obstructive pulmonary disease (COPD), as assessed by annual computed tomography (CT) using lung density (LD) measurements. Two hundred and fifty-four current smokers with COPD were randomised to treatment with either an inhaled corticosteroid (ICS), budesonide 400 μg bid, or placebo. COPD was defined as FEV1 ≤ 70% pred, FEV1/FVC ≤ 60% and no reversibility to β2-agonists and oral corticosteroids. The patients were followed for 2-4 years with biannual spirometry and annual CT and comprehensive lung function tests (LFT). CT images were analysed using Pulmo-CMS software.
LD was derived from a pixel-density histogram of the whole lung as the 15th percentile density (PD15) and the relative area of emphysema at a threshold of −910 Hounsfield units (RA-910), and both were volume-adjusted to predicted total lung capacity. At baseline, mean age was 64 years and 64 years; mean number of pack-years was 56 and 56; mean FEV1 was 1.53 L (51% pred) and 1.53 L (53% pred); mean PD15 was 103 g/L and 104 g/L; and mean RA-910 was 14% and 13%, respectively, for the budesonide and placebo groups. The annual fall in PD15 was −1.12 g/L in the budesonide group and −1.81 g/L in the placebo group (p = 0.09); the annual increase in RA-910 was 0.4% in the budesonide group and 1.1% in the placebo group (p = 0.02). There was no difference in annual decline in FEV1 between ICS (−54 mL) and placebo (−56 mL) (p = 0.89). Long-term budesonide inhalation shows a non-significant trend towards reducing the progression of emphysema as determined by the CT-derived 15th percentile lung density from annual CT scans in current smokers with moderate to severe COPD
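The two densitometry indices used in these trials are simple functions of the lung voxel-value distribution: PD15 is the Hounsfield-unit value below which 15% of lung voxels fall (reported in g/L via the standard approximation density ≈ HU + 1000), and RA-910 is the fraction of voxels below −910 HU. A minimal sketch of how they could be computed; the voxel sample is hypothetical and the volume-adjustment step applied by Pulmo-CMS is omitted:

```python
import numpy as np

def density_indices(hu_values, percentile=15, threshold=-910):
    """Compute two CT emphysema indices from lung voxel HU values.

    PD15: 15th-percentile density, converted from HU to g/L using
          the approximation density ~ HU + 1000 (water = 0 HU = 1000 g/L).
    RA:   relative area of emphysema, i.e. the percentage of voxels
          below the HU threshold (here -910 HU).
    """
    hu = np.asarray(hu_values, dtype=float)
    pd_hu = np.percentile(hu, percentile)         # HU value at the 15th percentile
    pd15_g_per_l = pd_hu + 1000.0                 # HU -> g/L
    ra_percent = 100.0 * np.mean(hu < threshold)  # % of voxels below threshold
    return pd15_g_per_l, ra_percent

# Hypothetical voxel sample: mostly normal lung around -870 HU with a
# smaller emphysematous tail near -950 HU.
rng = np.random.default_rng(0)
voxels = np.concatenate([rng.normal(-870, 30, 9000), rng.normal(-950, 15, 1000)])
pd15, ra = density_indices(voxels)
```

This also illustrates why a larger emphysematous tail both lowers PD15 and raises RA-910, so the two indices move in opposite directions as emphysema progresses.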
11,086
23,668,900
The results indicated that depression was related to one type of affective empathy. Specifically, depression was related to high levels of empathic stress but not to abnormal empathic concern. Further, depression was related to limited cognitive empathy, as indicated by poor perspective taking, theory of mind, and empathic accuracy. Poor performance on the more objective laboratory tasks might partially be explained by the broader cognitive deficits commonly observed in depression. Empathic abilities may be impaired in depression. The relation between empathy, depression, and gender is unclear. Insight into impaired empathy in depression may not only help explain poor social functioning in MDD but also benefit clinician-patient interactions
BACKGROUND Depression is associated with problems in social functioning. Impaired empathic abilities might underlie this association. Empathy is a multidimensional construct and involves both affective and cognitive processes. We reviewed the literature to find out to what extent depression may be associated with abnormal levels of affective and cognitive empathy. We also explored potential gender differences in these associations.
We investigated the effects of emotion perception training on depressive symptoms and mood in young adults reporting high levels of depressive symptoms (trial registration: ISRCTN02532638). Participants were randomised to an intervention procedure designed to increase the perception of happiness over sadness in ambiguous facial expressions or a control procedure, and completed self-report measures of depressive symptoms and mood. Those in the intervention condition had lower depressive symptoms and negative mood at 2-week follow-up, but there was no statistical evidence for a difference. There was some evidence for increased positive mood. Modification of emotional perception may lead to an increase in positive affect Previous studies of gender differences in the phenomenology of depression have focused mostly on symptoms as measured by self-report questionnaires or clinician-rated scales. In this study, we examined gender differences in the interpersonal behavior of depressed patients by using ethological techniques which involve direct observation of behavior. The nonverbal behavior of 72 nondepressed volunteers and 68 patients with a DSM-III-R diagnosis of nonpsychotic unipolar depression was videorecorded during clinical interviews and scored according to an ethological scoring system including 37 behavior patterns, mostly facial expressions and hand movements. Both male and female depressed patients showed a global restriction of nonverbal expressiveness reflecting a tendency towards social withdrawal. Nonverbal expression of hostility was the only behavioral category on which depressed patients scored higher than nondepressed volunteers. Even though clinical status exerted marked effects on the ethological profile, depression did not obscure some important differences in the nonverbal behavior of males and females. As a group, depressed women showed more socially interactive behaviors than depressed men.
Their modality of interacting included higher levels both of nonverbal hostility and of submissive and affiliative behaviors. These results are discussed in view of clinical data indicating a relationship between gender, style of social interaction and response to antidepressant drugs The goal of this investigation was to identify microlevel processes in the support provider that may foster or inhibit the provision of spousal support. Specifically, the authors focused on (a) how emotional similarity between the support provider and support seeker and (b) how empathic accuracy of the support provider relate to support provision in marriage. In a laboratory experiment, 30 couples were randomly assigned to 1 of 2 conditions (support provider: man vs. woman) of a factorial design. The couples provided questionnaire data and participated in a social support interaction designed to assess behaviors when offering and soliciting social support. A video-review task was used to assess emotional similarity and empathic accuracy during the support interaction. As expected, greater similarity between the support provider's and support seeker's emotional responses, as well as more accurate insights into the support-seeking spouse's thoughts and feelings, were found to be predictive of more skilful support (i.e., higher levels of emotional and instrumental support and lower levels of negative types of support) OBJECTIVE Schizophrenia patients often exhibit impairments in facial affect recognition which contribute to their poor social functioning. These impairments are stable in the course of the disorder and seem not to be affected by conventional treatment. The present study investigates the efficacy and specificity of a new training program for the remediation of such impairments.
METHOD A newly developed training program tackling affect recognition (TAR) was compared with a cognitive remediation training program (CRT) and treatment as usual (TAU) within a randomized three-group pre-post design in n=77 post-acute schizophrenia patients. The TAR is a computer-aided 12-session program focussing on facial affect recognition, whereas the CRT aims to improve attention, memory and executive functioning. Facial affect recognition, face recognition, and neurocognitive performance were assessed before (T0) and after (T1) the six-week training phase. During the training period all patients received antipsychotic medication. RESULTS Patients under TAR significantly improved in facial affect recognition, with recognition performance after training approaching the level of healthy controls from former studies. Patients under CRT and those without special training (TAU) did not improve in affect recognition, though patients under CRT improved in verbal memory functions. CONCLUSION According to these results, remediation of disturbed facial affect recognition in schizophrenia patients is possible, but not achievable with a traditional cognitive rehabilitation program such as the CRT. Instead, functional specialized remediation programs such as the newly developed TAR are a more suitable option
11,087
25,202,884
On the basis of the literature, cancer patients experience between two-fold and 20-fold higher risk of developing VTE than noncancer patients. They are more likely to experience a VTE event during the first 3-6 months after cancer diagnosis. In addition, an increased risk of VTE in patients with distant metastases and certain types of cancer (i.e. pancreatic or lung) was revealed. VTE was found to be a leading cause of mortality in cancer patients. The annual average total cost for cancer patients with VTE was found to be almost 50% higher than that of cancer patients without VTE.
The objective of this study was to present evidence on the epidemiology, health outcomes and economic burden of cancer-related venous thromboembolism (VTE).
BACKGROUND To investigate the incidence, risk factors and clinical implications of venous thromboembolism (VTE) in advanced cancer patients treated in phase I studies. PATIENTS AND METHODS Patients enrolled and treated in phase I studies conducted by the SENDO (Southern Europe New Drugs Organization) Foundation between 2000 and 2010 in 15 experimental centers were considered for the study. Clinical data, including adverse events, were prospectively collected during the studies and retrospectively pooled for VTE analysis. RESULTS Data of 1415 patients were considered for analysis. Five hundred and twenty-six (37.2%) patients were males, and median age was 57.3 years (range: 13-85). Eighty-five percent of patients had metastatic disease, while the remaining had locally advanced unresectable disease. For 706 (49.9%) of the patients, the study treatment was with cytotoxic agent(s) only, for 314 with target therapy(ies) only, while the remaining patients received a target therapy in combination with a cytotoxic drug. Fifty-six (3.96%) patients developed a VTE, almost all (89.3%) during the course of treatment, the remainder during follow-up. At univariate analysis, the Khorana score, the combination of an antiangiogenic agent with a cytotoxic drug, and the time from first cancer diagnosis to study entry (as a continuous variable) were associated with a statistically significant increase of VTE occurrence. The multivariate analysis confirmed a statistically significant association only for the Khorana score. The hazard ratio of VTE occurrence was 7.88 [95% confidence interval (CI) 2.86-21.70] and 2.74 (95% CI 1.27-5.92) times higher for the highest (≥3) and intermediate (1-2) scores as compared with score = 0. CONCLUSIONS VTE is a relatively common complication among patients treated in the context of phase I studies.
The Khorana score predicts VTE development and can be used to identify patients at high risk of VTE BACKGROUND Several studies have reported that patients who present with idiopathic deep vein thrombosis (DVT) have an increased risk of subsequently developing cancer. A clinical trial had previously been conducted examining the optimal duration of oral anticoagulant therapy following initial heparin treatment in patients with proximal DVT. METHODS A historical cohort study was performed on patients enrolled in the duration of anticoagulant trial. Patients known to have cancer at the time of entry into the trial were excluded. The qualifying DVTs were classified as idiopathic (no known associated risk factors) or secondary without knowledge of subsequent recurrent venous thrombosis or cancer. The patients were then followed for the development of cancer. RESULTS Thirteen (8.6%) of the 152 patients in the idiopathic cohort subsequently developed cancer compared to eight (7.1%) of 112 patients in the secondary cohort, P = 0.86. Two (5.4%) of 37 patients with recurrent venous thromboembolism and 19 (8.4%) of 227 patients without recurrent thromboembolism developed cancer, P = 0.7. CONCLUSION Our study did not detect an increased risk of subsequent cancer in patients presenting with idiopathic DVT compared to secondary DVT; nor did we detect an increased incidence of cancer in patients with recurrent venous thromboembolism. Further studies are required prior to pursuing a policy of aggressive screening for cancer in patients with idiopathic venous thromboembolism Summary Background Data: The epidemiology of venous thromboembolism (VTE) after cancer surgery is based on clinical trials on VTE prophylaxis that used venography to screen for deep vein thrombosis (DVT).
However, the clinical relevance of asymptomatic venography-detected DVT is unclear, and the population of these clinical trials is not necessarily representative of the overall cancer surgery population. Objective: The aim of this study was to evaluate the incidence of clinically overt VTE in a wide spectrum of consecutive patients undergoing surgery for cancer and to identify risk factors for VTE. Methods: @RISTOS was a prospective observational study in patients undergoing general, urologic, or gynecologic surgery. Patients were assessed for clinically overt VTE occurring up to 30 ± 5 days after surgery or more if the hospital stay was longer than 35 days. All outcome events were evaluated by an independent Adjudication Committee. Results: A total of 2373 patients were included in the study: 1238 (52%) undergoing general, 685 (29%) urologic, and 450 (19%) gynecologic surgery. In-hospital prophylaxis was given in 81.6% and postdischarge prophylaxis in 30.7% of the patients. Fifty patients (2.1%) were adjudicated as affected by clinically overt VTE (DVT, 0.42%; nonfatal pulmonary embolism, 0.88%; death 0.80%). The incidence of VTE was 2.83% in general surgery, 2.0% in gynecologic surgery, and 0.87% in urologic surgery. Forty percent of the events occurred later than 21 days from surgery. The overall death rate was 1.72%; in 46.3% of the cases, death was caused by VTE. In a multivariable analysis, 5 risk factors were identified: age above 60 years (2.63, 95% confidence interval, 1.21-5.71), previous VTE (5.98, 2.13-16.80), advanced cancer (2.68, 1.37-5.24), anesthesia lasting more than 2 hours (4.50, 1.06-19.04), and bed rest longer than 3 days (4.37, 2.45-7.78). Conclusions: VTE remains a common complication of cancer surgery, with a remarkable proportion of events occurring late after surgery.
In patients undergoing cancer surgery, VTE is the most common cause of death at 30 days after surgery 9572 Background: To assess the thrombogenic role of adjuvant chemotherapy, following radical surgical resection, in ambulatory cancer patients is of particular interest, because of the little or no tumor burden. The relation between adjuvant chemotherapy and thrombosis has been investigated in breast cancer. Limited information is available in GI cancers. The aim of this study was to investigate the acquired and inherited risk factors for VTE and the incidence of symptomatic VTE in patients on adjuvant chemotherapy for breast or GI cancer. METHODS In a prospective observational study (January 2003 to February 2006), 199 GI (82F/117M; age range, 26-84 years) and 182 breast (180F/2M; age range, 29-85 years) cancer patients were enrolled and followed up for symptomatic VTE during adjuvant chemotherapy. We prospectively evaluated the effect of acquired (i.e. age, chemotherapy, tumor histotype, history of thrombosis, body mass index and smoking) and inherited risk factors (i.e. antithrombin, protein C, protein S, homocysteine, activated protein C [APC] resistance, factor V Leiden [FVL] and prothrombin [PT] mutations). RESULTS Overall 30 VTE events (7.87%) were recorded: 28 (7.35%) during treatment and 2 (0.52%) during the subsequent follow-up. Among all the 381 cancer patients, FVL was detected in 14 cases (3.67%) and the PT mutation in 10 cases (2.62%). At multivariate analysis thrombocytosis (HR 2.8; 95% CI, 1.13-7.30, P < 0.026) and a previous episode of thrombosis (HR 12.4; 95% CI, 2.48-62.6, P < 0.0026) were significantly associated with the development of VTE. There was no association between the FVL or PT mutations and the risk of VTE. CONCLUSIONS Our data demonstrate that, in the adjuvant setting, most VTE occur during therapy.
Thrombocytosis before chemotherapy and a history of VTE strongly predict the development of VTE in the adjuvant setting. No significant financial relationships to disclose INTRODUCTION Systemic chemotherapy and surgery for patients with recurrent ovarian cancer (ROC) constitute a therapeutic challenge. Venous thromboembolism (VTE) seems to have a negative prognostic impact in patients with solid tumors including primary ovarian cancer in many series. Only limited contemporary data exist regarding the impact of VTE on ROC. PATIENTS AND METHODS Two large multicenter prospective controlled phase I/II-III studies on 2nd-line topotecan-based chemotherapy with platinum-sensitive or resistant ROC (N=525) were conducted on both operated and non-operated patients by the North-Eastern German Society of Gynaecologic Oncology Ovarian Cancer Study Group (NOGGO). Analysis was performed to identify incidence, predictors and prognosis of VTE. Survival analysis, univariate and Cox-regression analysis were performed to identify independent predictors of VTE, overall and progression-free survival. RESULTS Thirty-seven (7%) VTE episodes during chemotherapy were identified; 70% of them occurred within the first 2 months after initiation of chemotherapy. Ascites, as a sign of peritoneal carcinomatosis and advanced tumor disease, was identified as an independent predictor of VTE. Advanced age and high BMI did not appear to affect the VTE incidence significantly. High performance status, platinum-sensitivity, serous-papillary histology, lack of ascites and surgery appeared to positively affect survival by multivariate analysis. Overall survival and progression-free survival were similar between the VTE and no-VTE patients. CONCLUSION ROC patients appear to have the highest risk for developing VTE when ascites exists and during the first 2 months following chemotherapy initiation.
In contrast to primary ovarian cancer, VTE could not be identified to affect overall survival in relapsed malignant ovarian disease Cancer and its treatment are recognized risk factors for VTE. The compliance rate with published VTE prophylaxis guidelines is low. The decision on when to offer prophylaxis to hospitalized cancer patients is difficult to make. This paper describes current clinical practice in offering VTE prophylaxis to hospitalized cancer patients. Prophylaxis rate and rate of VTE will be correlated with the risk level. We prospectively followed all consecutive adult cancer patients admitted to medical units over a 5-month period. The Caprini risk assessment model, with some modifications, was utilized to determine risk of VTE. Six hundred and six patients (51% males, median age 52 years, range 18-91) were included. Reasons for admission included infections (25%), chemotherapy (22%) and palliative care (10%). In addition to cancer, the most frequently encountered risk factors for VTE were: immobilization (35%), age > 60 years (31%) and body mass index > 30 (20%). Patients were grouped according to their total risk score: low (9%), moderate (44%) and high risk (47%). The VTE prophylaxis rate was 55.1% for the whole study group. Following discharge, patients were followed for 60 days. The incidence of VTE was 3.4% in the moderate and 4.2% in the high-risk groups, while none in the low-risk group developed VTE. Many additional risk factors for VTE are usually encountered in hospitalized cancer patients. Cancer alone may not be a sufficient reason for VTE prophylaxis. A risk assessment model able to stratify patients into different risk categories will simplify decision making and enhance the VTE prophylaxis rate Introduction: The risk of symptomatic deep vein thrombosis (DVT) among patients with non-small cell lung cancer (NSCLC) has not been well studied.
We conducted a retrospective cohort study of patients with NSCLC to determine the incidence of DVT and to characterize predictors of DVT in patients with NSCLC. Methods: The pulmonary oncology database of the Sir Mortimer B. Davis-Jewish General Hospital contains prospectively collected clinical data on lung cancer patients since January 1, 1997. We identified all consecutive patients with histologically confirmed new diagnoses of NSCLC between January 1, 1997 and December 31, 2004, and we determined the occurrence of an objectively defined DVT. Data on clinical and tumor characteristics were collected and compared among patients with DVT and patients without DVT. Results: Of the 493 NSCLC patients included in the cohort for a total of 634 person-years, 67 (13.6%) patients developed objectively confirmed DVTs, with an incidence of 110 cases (95% confidence interval [CI] 80, 130) per 1000 person-years. An adjusted multivariable regression analysis showed that advanced stage (rate ratio [RR] 2.63, 95% CI 1.38, 5.00) and male sex (RR 1.75, 95% CI 1.03-2.94) were independent predictors of DVT. Conclusions: Our results show a high incidence of DVT in NSCLC patients. Advanced stage and, to a lesser extent, male sex are important predictors of DVT. Trials to evaluate the use of prophylactic anticoagulant treatments in patients with NSCLC should be conducted BACKGROUND Venous thrombosis with and without pulmonary embolism is a frequent complication of malignancies and second among the causes of death in tumour patients. Its incidence is reported to be 10 to 15%. Since for methodological reasons this rate can be assumed to be too low and to disregard asymptomatic venous thrombosis, a combined retrospective and prospective study was performed to examine the actual frequency of venous thrombosis in tumour patients.
PATIENTS AND METHODS The histories of 409 patients (175 women, 234 men, mean age 69 years [19 to 96 years]) with different tumours, consecutively enrolled in the order of their altogether 426 inpatient treatments, were checked in retrospect for the frequency of venous thrombosis and pulmonary embolism. Subsequently, 97 tumour inpatients (36 women, 61 men, mean age 70 years [42 to 90 years]) were systematically screened, by means of duplex sonography and/or venography, for venous thromboses in the veins of the pelvis and both legs. RESULTS In the retrospective analysis, where no systematic screening for thromboses was performed and only symptomatic thrombosis was recorded, venous thrombosis was found in 6.6% of all tumour patients, whereas in the prospective examination with systematic duplex sonography and/or venography of all patients, the percentage was 33%. In the prospective study, 31.3% of venous thromboses were symptomatic and 68.7% asymptomatic. In 39.3% of the cases in the retrospective analysis and 25% in the prospective analysis, venous thrombosis occurred during chemotherapy, surgery or radiation therapy. Venous thrombosis was most often seen in metastasizing tumours and in colorectal carcinoma (40%), haematological system diseases (28.6%), gastric cancer (30%), bronchial, pancreas and ovarian carcinoma (28.6%), and carcinoma of the prostate (16.7%). CONCLUSION Regular screening for thrombosis is indicated even in asymptomatic tumour patients because asymptomatic venous thrombosis is frequent, can lead to pulmonary embolism and has to be treated like symptomatic venous thrombosis. This is particularly true for metastatic disease and during chemotherapy, surgical interventions, or radiation BACKGROUND Data on venous thromboembolism (VTE) in gastric cancer (GC) are very scarce. OBJECTIVE To investigate the incidence, risk factors and prognostic implications of VTE in Asian GC patients.
METHODS Prospective databases containing clinical information on GC patients (n = 2,085) were used. RESULTS The 2-year cumulative incidences of all VTE events were 0.5%, 3.5% and 24.4% in stages I, II-IV(M0) and IV(M1), respectively. Advanced stage, older age and no major surgery were independent risk factors for developing VTE. When the VTE cases were classified into extremity venous thrombosis (EVT), pulmonary thromboembolism (PTE) or intra-abdominal venous thrombosis (IVT), IVTs (62%) were more common than EVTs (21%) or PTEs (17%). Although peri-operative pharmacologic thromboprophylaxis was not routinely administered, the VTE incidence after major surgery was only 0.2%. During chemotherapy, EVT/PTE developed more frequently than IVT (54% vs. 19%); however, during untreated or treatment-refractory periods, IVT developed more frequently than EVT/PTE (69% vs. 36%). In multivariate models, the development of EVT/PTE was a significant predictor of early death when compared with no occurrence of VTE (P < 0.05). However, IVT did not affect survival. CONCLUSION This is the largest study that specifically focused on VTE in GC, and the first to demonstrate the VTE incidence in Asian GC patients. Considering the low incidence of post-operative VTE development, the necessity of peri-operative pharmacologic thromboprophylaxis should be evaluated separately in Asian patients. The clinical situation of the development of EVT/PTE and IVT differed. Only EVT/PTE had an adverse effect on survival and IVT had no prognostic significance Objectives: To determine whether a postoperative venous thromboembolism (VTE) is associated with a worse prognosis and/or a more advanced cancer stage, and to evaluate the association between a postoperative VTE and cancer-specific survival when known prognostic factors, such as age, stage, cancer type, and type of surgery, are controlled.
Context: It is unknown whether oncology patients who develop a venous thromboembolism after a complete curative resection are at the same survival disadvantage as oncology patients with a spontaneous VTE. Methods: A retrospective case-control study was conducted at Memorial Sloan-Kettering Cancer Center. Years of study: January 1, 2000, to December 31, 2005. Median follow-up: 24.9 months (interquartile range 13.0, 43.0). All cancer patients who underwent abdominal, pelvic, thoracic, or soft tissue procedures and those who developed a VTE within 30 days of the procedure were identified from a prospective morbidity and mortality database. Overall survival (OS) was calculated for the entire cohort. In the matched cohort, OS and disease-specific survival (DSS) were calculated for stages 0 to 3 and stages 0 to 2. Results: A total of 23,541 cancer patients underwent an invasive procedure and 474 (2%) had a postoperative VTE. VTE patients had a significantly worse 5-year OS compared to no-VTE patients (43.8% vs 61.2%; P < 0.0001); 205 VTE patients (stages 0-3) were matched to 2050 controls by age, sex, cancer type, stage, and surgical procedure. In this matched analysis, VTE patients continued to demonstrate a significantly worse prognosis with an inferior 5-year OS (54.7% vs 66.3%; P < 0.0001) and DSS (67.8% vs 79.5%; P = 0.0007) as compared to controls. The survival difference persisted in early-stage disease (stage 0-2), with 5-year DSS of 82.9% versus 87.3% (P = 0.01). Conclusions: Postoperative VTE in oncology patients with limited disease and a complete surgical resection is associated with an inferior cancer survival.
A postoperative VTE remains a poor prognostic factor, even when controlling for age, stage, cancer type, and surgical procedure, further supporting an independent link between hypercoagulability and cancer survival A small proportion of patients with deep vein thrombosis develop recurrent venous thromboembolic complications or bleeding during anticoagulant treatment. These complications may occur more frequently if these patients have concomitant cancer. This prospective follow-up study sought to determine whether, among thrombosis patients, those with cancer have a higher risk for recurrent venous thromboembolism or bleeding during anticoagulant treatment than those without cancer. Of the 842 included patients, 181 had known cancer at entry. The 12-month cumulative incidence of recurrent thromboembolism in cancer patients was 20.7% (95% CI, 15.6%-25.8%) versus 6.8% (95% CI, 3.9%-9.7%) in patients without cancer, for a hazard ratio of 3.2 (95% CI, 1.9-5.4). The 12-month cumulative incidence of major bleeding was 12.4% (95% CI, 6.5%-18.2%) in patients with cancer and 4.9% (95% CI, 2.5%-7.4%) in patients without cancer, for a hazard ratio of 2.2 (95% CI, 1.2-4.1). Recurrence and bleeding were both related to cancer severity and occurred predominantly during the first month of anticoagulant therapy but could not be explained by sub- or overanticoagulation. Cancer patients with venous thrombosis are more likely to develop recurrent thromboembolic complications and major bleeding during anticoagulant treatment than those without malignancy. These risks correlate with the extent of cancer. Possibilities for improvement using the current paradigms of anticoagulation seem limited and new treatment strategies should be developed BACKGROUND The clinical significance of incidental venous thrombosis (IVT) is uncertain.
The objective of this study was to compare the clinical characteristics and the outcome of cancer patients with IVT with those of patients with symptomatic venous thrombosis (SVT). PATIENTS AND METHODS Prospective observational study enrolling consecutive cancer patients newly diagnosed with venous thromboembolism (May 2006-April 2009). Diagnosis of IVT was based on vascular filling defects in scheduled computed tomography scans in the absence of clinical symptoms. Anticoagulant therapy was routinely prescribed regardless of SVT or IVT. RESULTS IVT was diagnosed in 94 out of 340 (28%) patients. Patients with IVT were older (63.7 ± 10.5 versus 60.8 ± 10.5 years, P = 0.035), more frequently had metastatic cancer (82% versus 65%, P = 0.01) and were less likely to be receiving chemotherapy at the time of the thrombotic event (53% versus 67%, P = 0.018). Mean follow-up was 477 days. A lower risk of venous rethromboses was observed in patients with IVT (log-rank P = 0.043), with no differences in major bleeding and overall survival compared with SVT patients. CONCLUSIONS A high proportion of venous thrombotic events in cancer patients are diagnosed incidentally during scheduled imaging. Prospective controlled trials evaluating the optimal therapy in this setting are required Abstract Objective To determine the efficacy and safety of the anticoagulant fondaparinux in older acute medical inpatients at moderate to high risk of venous thromboembolism. Design Double-blind randomised placebo-controlled trial. Setting 35 centres in eight countries. Participants 849 medical patients aged 60 or more admitted to hospital for congestive heart failure, acute respiratory illness in the presence of chronic lung disease, or acute infectious or inflammatory disease and expected to remain in bed for at least four days. Interventions 2.5 mg fondaparinux or placebo subcutaneously once daily for six to 14 days.
Outcome measure The primary efficacy outcome was venous thromboembolism detected by routine bilateral venography along with symptomatic venous thromboembolism up to day 15. Secondary outcomes were bleeding and death. Patients were followed up at one month. Results 425 patients in the fondaparinux group and 414 patients in the placebo group were evaluable for safety analysis (10 were not treated). 644 patients (75.9%) were available for the primary efficacy analysis. Venous thromboembolism was detected in 5.6% (18/321) of patients treated with fondaparinux and 10.5% (34/323) of patients given placebo, a relative risk reduction of 46.7% (95% confidence interval 7.7% to 69.3%). Symptomatic venous thromboembolism occurred in five patients in the placebo group and none in the fondaparinux group (P = 0.029). Major bleeding occurred in one patient (0.2%) in each group. At the end of follow-up, 14 patients in the fondaparinux group (3.3%) and 25 in the placebo group (6.0%) had died. Conclusion Fondaparinux is effective in the prevention of asymptomatic and symptomatic venous thromboembolic events in older acute medical patients. The frequency of major bleeding was similar for both fondaparinux and placebo treated patients. PURPOSE This study was designed to determine the incidence of venous and arterial thromboembolic events (TEEs) in patients treated with cisplatin-based chemotherapy and to analyze the prognostic value of patients' baseline and treatment characteristics in predicting TEE occurrence. PATIENTS AND METHODS We performed a large retrospective analysis of all patients treated with cisplatin-based chemotherapy for any type of malignancy at Memorial Sloan-Kettering Cancer Center in 2008. A TEE was cisplatin-associated if it occurred between the time of the first dose of cisplatin and 4 weeks after the last dose.
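The 46.7% relative risk reduction reported for fondaparinux above follows directly from the two raw event counts; a minimal sketch of the arithmetic (the function name is ours, not from the trial, and only point estimates are computed):

```python
def risk_summary(events_tx, n_tx, events_ctrl, n_ctrl):
    """Absolute risks, relative risk, and relative risk reduction
    from raw event counts in two trial arms (point estimates only;
    a confidence interval would need the usual log-RR variance)."""
    risk_tx = events_tx / n_tx        # risk in the treated arm
    risk_ctrl = events_ctrl / n_ctrl  # risk in the control arm
    rr = risk_tx / risk_ctrl          # relative risk
    rrr = 1.0 - rr                    # relative risk reduction
    return risk_tx, risk_ctrl, rr, rrr

# Fondaparinux trial counts: 18/321 treated vs 34/323 placebo
risk_tx, risk_ctrl, rr, rrr = risk_summary(18, 321, 34, 323)
```

With these counts `rrr` comes out at about 0.467, matching the reported 46.7% relative risk reduction, and the two absolute risks reproduce the quoted 5.6% and 10.5%.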
RESULTS Among 932 patients, 169 (18.1%) experienced a TEE during treatment or within 4 weeks of the last dose. TEEs included deep vein thrombosis (DVT) alone in 49.7%, pulmonary embolus (PE) alone in 25.4%, DVT plus PE in 13.6%, arterial TEE alone in 8.3%, or DVT plus arterial TEE in 3.0%. TEEs occurred within 100 days of initiation of treatment in 88% of patients. By univariate analysis, sex, age, race, Karnofsky performance status (KPS), exposure to erythropoiesis-stimulating agents, presence of central venous catheter (CVC), site of cancer, stage of cancer, leukocyte and hemoglobin levels, and Khorana score were all identified as risk factors. However, by multivariate analysis, only age, KPS, presence of CVC, and Khorana score retained significance. CONCLUSION This large retrospective analysis confirms the unacceptable incidence of TEEs in patients receiving cisplatin-based chemotherapy. In view of the controversy associated with prophylactic anticoagulation in patients with cancer treated with chemotherapy, randomized studies are urgently needed in this specific cancer population treated with cisplatin-based regimens. BACKGROUND The incidence of venous thromboembolism after diagnosis of specific cancers and the effect of thromboembolism on survival are not well defined. METHODS The California Cancer Registry was linked to the California Patient Discharge Data Set to determine the incidence of venous thromboembolism among cancer cases diagnosed between 1993 and 1995. The incidence and timing of thromboembolism within 1 and 2 years of cancer diagnosis and the risk factors associated with thromboembolism and death were determined. RESULTS Among 235 149 cancer cases, 3775 (1.6%) were diagnosed with venous thromboembolism within 2 years, 463 (12%) at the time cancer was diagnosed and 3312 (88%) subsequently.
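The Khorana score used as a predictor above is a simple five-item additive model for chemotherapy-associated VTE risk; a sketch of the standard scoring (function and variable names are ours, and the site lists are abbreviated to the commonly cited categories):

```python
def khorana_score(cancer_site, platelets_1e9_per_l, hemoglobin_g_dl,
                  uses_esa, leukocytes_1e9_per_l, bmi):
    """Khorana risk score for chemotherapy-associated VTE (0-6 points).
    A total of >= 3 is conventionally classed as high risk."""
    score = 0
    if cancer_site in {"stomach", "pancreas"}:            # very-high-risk sites
        score += 2
    elif cancer_site in {"lung", "lymphoma", "gynecologic",
                         "bladder", "testicular"}:        # high-risk sites
        score += 1
    if platelets_1e9_per_l >= 350:                        # pre-chemo platelets
        score += 1
    if hemoglobin_g_dl < 10 or uses_esa:                  # anemia or ESA use
        score += 1
    if leukocytes_1e9_per_l > 11:                         # pre-chemo leukocytes
        score += 1
    if bmi >= 35:                                         # obesity
        score += 1
    return score
```

For example, a pancreatic cancer patient with high platelets, low hemoglobin, leukocytosis and BMI 36 scores the maximum of 6.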
In risk-adjusted models, metastatic disease at the time of diagnosis was the strongest predictor of thromboembolism. Expressed as events per 100 patient-years, the highest incidence of thromboembolism occurred during the first year of follow-up among cases with metastatic-stage pancreatic (20.0), stomach (10.7), bladder (7.9), uterine (6.4), renal (6.0), and lung (5.0) cancer. Adjusting for age, race, and stage, diagnosis of thromboembolism was a significant predictor of decreased survival during the first year for all cancer types (hazard ratios, 1.6-4.2; P<.01). CONCLUSIONS The incidence of venous thromboembolism varied with cancer type and was highest among patients initially diagnosed with metastatic-stage disease. The incidence rate of thromboembolism decreased over time. Diagnosis of thromboembolism during the first year of follow-up was a significant predictor of death for most cancer types and stages analyzed. For some types of cancer, the incidence of thromboembolism was sufficiently high to warrant prospective clinical trials of primary thromboprophylaxis. BACKGROUND Pancreatic adenocarcinoma is one of the cancers most frequently associated with venous thromboembolism (VTE), with varying incidences of 10%-20% reported in Western countries. Asians are known to have a lower risk for VTE than Caucasians, but few studies have been conducted regarding the incidence of VTE in Asian cancer patients. MATERIALS AND METHODS Incidence of radiologically confirmed VTE was assessed by review of medical records of all patients histologically diagnosed with advanced pancreatic adenocarcinoma, with follow-up conducted at a regional teaching hospital from Jun 2003 to Dec 2005. RESULTS Seventy-five patients with advanced pancreatic adenocarcinoma were identified for evaluation (M:F = 44:31, locally advanced:metastatic = 25:50, median age: 67 years).
Four patients (5.3%) developed VTE during a median follow-up period of 124 days. Three of four patients had metastatic disease and were receiving chemotherapy when VTE developed. The mean time from cancer diagnosis to the detection of VTE was 160 days. No episodes of peripheral arterial thrombosis were detected, but three multiple cerebral infarctions occurred, which proved fatal in all three. Median survival time was shorter in patients with VTE than those without, but the difference was not statistically significant (4.3 vs 6.6 months). CONCLUSION The incidence of VTE complications in Korean patients with advanced pancreatic cancer was 5.3%, which is lower than that observed in other ethnic groups. Our study warrants further prospective investigations on the incidence and mechanism of VTE and cerebral infarctions in cancer patients of different ethnic groups. AIM OF THE STUDY To investigate the incidence and clinical implications of venous thromboembolism (VTE) in advanced colorectal cancer (ACC) patients treated and followed up through a prospective randomised trial comparing FOLFIRI chemotherapy given as an intermittent or as a continuous schedule. PATIENTS, MATERIALS AND METHODS A total of 266 patients were randomised by 15 experimental centres: 168 (63.2%) were males, median age: 64.6 years, age range: 37-76 years. Almost all (95.5%) patients had metastatic disease, while the remainder were classified with locally advanced irresectable disease. For 138 (51.9%) of the patients, the chemotherapy treatment was intermittent FOLFIRI and the remaining patients received continuous treatment. All toxicities, including VTE, were prospectively collected. RESULTS During the study protocol, the central data management gathered two cases of VTE.
Our analysis retrieved 27 (10.2%) patients who developed a VTE, almost all (89%) during the course of chemotherapy treatment: 20 out of 27 during FOLFIRI, the remaining 7 during following lines or follow-up. VTE was the most frequent grade 3/4 toxicity. The incidence of VTE was significantly increased in the patients receiving continuous rather than intermittent treatment (HR 2.67, 95% CI 1.17-6.10; p<0.02). CONCLUSION VTE is a common complication among advanced colorectal cancer patients and yet this type of toxicity is widely underestimated. In this randomised trial, VTE was the most frequent grade 3/4 toxicity. Use of an intermittent schedule is associated with a reduced risk of developing VTE. BACKGROUND Annualised figures show an up to 7-fold higher incidence of vascular thromboembolism (VTE) in patients with advanced pancreatic cancer (APC) compared to other common malignancies. Concurrent VTE has been shown to confer a worse overall prognosis in APC. METHODS One hundred and twenty-three APC patients were randomised to receive either gemcitabine 1000 mg/m(2) or the same with weight-adjusted dalteparin (WAD) for 12 weeks. Primary end-point was the reduction of all-type VTE during the study period. NCT00462852, ISRCTN: 76464767. FINDINGS The incidence of all-type VTE during the WAD treatment period (<100 days from randomisation) was reduced from 23% to 3.4% (p = 0.002), with a risk ratio (RR) of 0.145, 95% confidence interval (CI) (0.035-0.612) and an 85% risk reduction. All-type VTE throughout the whole follow-up period was reduced from 28% to 12% (p = 0.039), RR = 0.419, 95% CI (0.187-0.935) and a 58% risk reduction. Lethal VTE <100 days was seen only in the control arm, 8.3% compared to 0% (p = 0.057), RR = 0.092, 95% CI (0.005-1.635).
INTERPRETATION Weight-adjusted dalteparin used as primary prophylaxis for 12 weeks is safe and produces a highly significant reduction of all-type VTE during the prophylaxis period. The benefit is maintained after dalteparin withdrawal although decreases with time. BACKGROUND The efficacy and safety of thromboprophylaxis in patients with acute medical illnesses who may be at risk for venous thromboembolism have not been determined in adequately designed trials. METHODS In a double-blind study, we randomly assigned 1102 hospitalized patients older than 40 years to receive 40 mg of enoxaparin, 20 mg of enoxaparin, or placebo subcutaneously once daily for 6 to 14 days. Most patients were not in an intensive care unit. The primary outcome was venous thromboembolism between days 1 and 14, defined as deep-vein thrombosis detected by bilateral venography (or duplex ultrasonography) between days 6 and 14 (or earlier if clinically indicated) or documented pulmonary embolism. The duration of follow-up was three months. RESULTS The primary outcome could be assessed in 866 patients. The incidence of venous thromboembolism was significantly lower in the group that received 40 mg of enoxaparin (5.5 percent [16 of 291 patients]) than in the group that received placebo (14.9 percent [43 of 288 patients]) (relative risk, 0.37; 97.6 percent confidence interval, 0.22 to 0.63; P < 0.001). The benefit observed with 40 mg of enoxaparin was maintained at three months. There was no significant difference in the incidence of venous thromboembolism between the group that received 20 mg of enoxaparin (43 of 287 patients [15.0 percent]) and the placebo group. The incidence of adverse effects did not differ significantly between the placebo group and either enoxaparin group.
By day 110, 50 patients had died in the placebo group (13.9 percent), 51 had died in the 20-mg group (14.7 percent), and 41 had died in the 40-mg group (11.4 percent); the differences were not significant. CONCLUSIONS Prophylactic treatment with 40 mg of enoxaparin subcutaneously per day safely and effectively reduces the risk of venous thromboembolism in patients with acute medical illnesses. The reported incidence of a subsequent diagnosis of malignancy in patients presenting with deep vein thrombosis (DVT) varies from 2-25%. Risk indicators and diagnostic procedures to be performed in these patients are controversial. Background The occurrence of venous thromboembolism (VTE), manifesting as deep vein thrombosis (DVT) or pulmonary embolism (PE), after colorectal cancer surgery in Asian patients remains poorly characterized. The present study was designed to investigate the incidence of symptomatic VTE in Korean colorectal cancer patients following surgery, and to identify the associated risk factors. Methods We retrospectively analyzed data from patients who developed symptomatic VTE after colorectal cancer surgery between 2006 and 2008. Deep vein thrombosis was diagnosed with Doppler ultrasound or contrast venography, and PE was identified with lung ventilation/perfusion scans or chest computed tomography. Thromboprophylaxis, including low-molecular-weight heparin, graduated compression stockings, and intermittent pneumatic compression, was used in patients considered at high risk of VTE. Results Of the 3,645 patients who underwent colorectal cancer surgery, 31 (0.85%) developed symptomatic VTE. Of those 31 patients, 23 (74.2%) had DVT, 16 (51.6%) had PE, and 8 (25.8%) had both. Two patients died from PE.
Univariate analysis showed that a history of VTE, pre-existing cardiovascular disease, respiratory disease, transfusions, postoperative immobilization time, and postoperative complications were associated with VTE (p < 0.05 for each). Multivariate analysis showed that a history of VTE, pre-existing cardiovascular disease, postoperative complication, advanced cancer stage, and postoperative immobilization time were risk factors for developing symptomatic VTE. The mean hospital stay was 18.3 days, and the mortality rate was 6.5%. Conclusions The incidences of symptomatic DVT and PE were found to be not low in Asian colorectal cancer surgery patients compared with Western countries. The risk factors for VTE were a history of VTE, pre-existing cardiovascular disease, postoperative complications, advanced cancer stage, and postoperative immobilization. Thromboprophylaxis should be strongly considered in patients with these characteristics. Large prospective randomized controlled trials should be conducted to further evaluate the risk of VTE in Asian patients, and to determine the optimal prophylaxis. Objective: To minimize treatment variations, we have implemented clinical pathways for all breast cancer patients undergoing surgery. We sought to determine the incidence of postoperative venous thromboembolism (VTE) in patients treated on these pathways. Summary Background Data: Cancer patients have an increased risk of VTE because of a hypercoagulable state. The risk of VTE following breast cancer surgery is not well established. Methods: We retrospectively reviewed prospectively collected data for all patients who underwent breast cancer surgery and were treated on the clinical pathways with mechanical antiembolism devices and early ambulation in the postoperative period between January 2000 and September 2003. Results: During the study period, 3898 patients underwent 4416 surgical procedures.
Seven patients with postoperative VTE within 60 days were identified, for a rate of 0.16% per procedure. Six patients presented with only a deep venous thrombosis or a pulmonary embolism; 1 patient had both. The median time from surgery to diagnosis of VTE was 14 days (range, 2-60 days; mean, 22 days). No relationship was identified between stage of breast cancer or type of breast surgery and development of VTE. Two (29%) of the 7 patients with VTE had received neoadjuvant chemotherapy. VTE treatment consisted of subcutaneous low-molecular-weight heparin (n = 5) or intravenous heparin (n = 2) followed by warfarin. There were no deaths. Conclusions: VTE following breast cancer surgery is rare in patients who are treated on clinical pathways with mechanical antiembolism devices and early ambulation in the postoperative period. We conclude that systemic VTE prophylaxis is not indicated in this group of patients. BACKGROUND Evidence suggests that cancer patients who develop venous thromboembolism (VTE) have a poorer outcome than those who do not. The aim of this prospective study was to assess the incidence of the development of VTE in breast cancer patients commencing chemotherapy and the relationship between development of thrombosis and cancer progression and death. PATIENTS AND METHODS One hundred and thirty-four breast cancer patients were recruited and followed up prior to chemotherapy and at 3, 6, 12 and 24 months. Duplex ultrasound imaging (DUI) was performed 1 month following commencement of chemotherapy or if patients became symptomatic. RESULTS Thirteen patients developed VTE. Six patients with advanced breast cancer and seven with early breast cancer developed VTE. Three patients died from VTE; all had advanced breast cancer. In patients with VTE, the 28-day mortality rate was 15%, but in patients with symptomatic VTE, the 28-day mortality was 22%.
Development of VTE did not predict for progression by three and six months in advanced breast cancer patients. VTE demonstrated a trend for predicting progression by two years. Using Cox regression survival analysis, there was no survival advantage in those with or without VTE. CONCLUSION Although the body of evidence supports a worse prognosis when VTE and cancer coexist as compared to either diagnosis alone, a larger prospective study is required to confirm this and clarify whether any premature death is primarily due to VTE or to more aggressive cancer. Background: Considerable variability exists in the use of pharmacological thromboprophylaxis among acutely ill medical patients, partly because clinically relevant end points have not been fully assessed in this population. We undertook an international, multicenter, randomized, double-blind, placebo-controlled trial using clinically important outcomes to assess the efficacy and safety of dalteparin in the prevention of venous thromboembolism in such patients. Methods and Results: Patients (n = 3706) were randomly assigned to receive either subcutaneous dalteparin 5000 IU daily or placebo for 14 days and were followed up for 90 days. The primary end point was venous thromboembolism, defined as the combination of symptomatic deep vein thrombosis, symptomatic pulmonary embolism, and asymptomatic proximal deep vein thrombosis detected by compression ultrasound at day 21 and sudden death by day 21. The incidence of venous thromboembolism was reduced from 4.96% (73 of 1473 patients) in the placebo group to 2.77% (42 of 1518 patients) in the dalteparin group, an absolute risk reduction of 2.19% or a relative risk reduction of 45% (relative risk, 0.55; 95% CI, 0.38 to 0.80; P = 0.0015). The observed benefit was maintained at 90 days.
The overall incidence of major bleeding was low but higher in the dalteparin group (9 patients; 0.49%) compared with the placebo group (3 patients; 0.16%). Conclusions: Dalteparin 5000 IU once daily halved the rate of venous thromboembolism with a low risk of bleeding. PURPOSE Although guidelines for venous thromboembolism prevention are available, the implementation of anticoagulant prophylaxis in patients with advanced cancer has yet to be more clearly defined. We aim to determine the incidence of lower extremity deep vein thrombosis (DVT) diagnosed by Doppler sonography (USD) in asymptomatic nonambulatory patients with advanced cancer. METHOD In a prospective study, 44 nonambulatory cancer patients with grade 3-4 World Health Organization performance status, asymptomatic for lower extremity DVT, underwent bilateral venous USD studies of the lower extremities. Different risk factors and laboratory data were registered and correlated with the incidence of DVT. RESULTS Asymptomatic DVT was detected in 15 of 44 patients (34%, 95% CI, 0.21-0.49). Twenty-three percent of all patients had isolated deep calf vein thrombi and 11% of all patients had thrombi in the proximal veins. The only significant risk factor was the number of metastatic sites. DVT was found in 4 of 23 (17.4%) patients with one metastatic site as opposed to 11 of 21 (52.3%) with two or more sites (p < 0.01). CONCLUSION USD of the lower extremities detected asymptomatic DVT in 34% of advanced nonambulatory cancer patients and may serve as an additional decision-making tool in the consideration of anticoagulant therapy for this specific population. BACKGROUND Although thromboprophylaxis reduces the incidence of venous thromboembolism in acutely ill medical patients, an associated reduction in the rate of death from any cause has not been shown.
METHODS We conducted a double-blind, placebo-controlled, randomized trial to assess the effect of subcutaneous enoxaparin (40 mg daily) as compared with placebo, both administered for 10±4 days in patients who were wearing elastic stockings with graduated compression, on the rate of death from any cause among hospitalized, acutely ill medical patients at participating sites in China, India, Korea, Malaysia, Mexico, the Philippines, and Tunisia. Inclusion criteria were an age of at least 40 years and hospitalization for acute decompensated heart failure, severe systemic infection with at least one risk factor for venous thromboembolism, or active cancer. The primary efficacy outcome was the rate of death from any cause at 30 days after randomization. The primary safety outcome was the rate of major bleeding during and up to 48 hours after the treatment period. RESULTS A total of 8307 patients were randomly assigned to receive enoxaparin plus elastic stockings with graduated compression (4171 patients) or placebo plus elastic stockings with graduated compression (4136 patients) and were included in the intention-to-treat population. The rate of death from any cause at day 30 was 4.9% in the enoxaparin group as compared with 4.8% in the placebo group (risk ratio, 1.0; 95% confidence interval [CI], 0.8 to 1.2; P = 0.83). The rate of major bleeding was 0.4% in the enoxaparin group and 0.3% in the placebo group (risk ratio, 1.4; 95% CI, 0.7 to 3.1; P = 0.35). CONCLUSIONS The use of enoxaparin plus elastic stockings with graduated compression, as compared with elastic stockings with graduated compression alone, was not associated with a reduction in the rate of death from any cause among hospitalized, acutely ill medical patients. (Funded by Sanofi; LIFENOX ClinicalTrials.gov number, NCT00622648.) PURPOSE Venous thromboembolism (VTE) has been associated with negative prognosis in cancer patients.
Most series reporting on VTE have included different tumor types, not differentiating between recurrent or primary disease. Data regarding the actual impact of VTE on primary advanced ovarian cancer (AOC) are limited. PATIENTS AND METHODS Between 1995 and 2002, the Arbeitsgemeinschaft Gynaekologische Onkologie Ovarian Cancer Study group (AGO-OVAR) recruited 2,743 patients with AOC in three prospectively randomized trials on platinum paclitaxel-based chemotherapy after primary surgery. Pooled data analysis was performed to evaluate incidence, predictors, and prognostic impact of VTE in AOC. Survival curves were calculated for the VTE incidence. Univariate analysis and Cox regression analysis were performed to identify independent predictors of VTE and mortality. RESULTS Seventy-six VTE episodes were identified, which occurred during six to 11 cycles of adjuvant chemotherapy; 50% of them occurred within 2 months postoperatively. Multivariate analysis identified body mass index higher than 30 kg/m(2) and increasing age as independent predictors of VTE. International Federation of Gynecology and Obstetrics stage and surgical radicality did not affect incidence. Overall survival was significantly reduced in patients with VTE (median, 29.8 v 36.2 months; P = .03). Multivariate analysis identified pulmonary embolism (PE), but not deep vein thrombosis alone, to be of prognostic significance. In addition, VTE was not identified to significantly affect progression-free survival. CONCLUSION Patients with AOC have their highest VTE risk within the first 2 months after radical surgery. Only VTE complicated by symptomatic PE has been identified to have a negative impact on survival. Studies evaluating the role of prophylactic anticoagulation during this high-risk postoperative period are warranted.
11,088
19,020,970
The systematic review identified better HAART outcomes among former DU, those with less severe psychiatric conditions, and those receiving opioid substitution therapy and/or psychosocial support. Patients initiating HAART with lower viral load and higher CD4 counts, and those without co-infections, also had better treatment outcomes. Our findings suggest that HIV+ DU tend to be inappropriately assumed to be less adherent and unlikely to achieve desirable treatment outcomes, when compared to their non-DU cohort.
null
null
11,089
19,549,700
Evidence supports the use of quantitative MCE as a non-invasive test for detection of CAD.
AIMS We conducted a meta-analysis to evaluate the accuracy of quantitative stress myocardial contrast echocardiography (MCE) in coronary artery disease (CAD).
BACKGROUND Qualitative interpretation of myocardial contrast echocardiography (MCE) improves the accuracy of wall-motion analysis for assessment of coronary artery disease (CAD). We examined the feasibility and accuracy of quantitative MCE for diagnosis of CAD. METHODS Dipyridamole/exercise stress MCE (destruction-replenishment protocol with real-time imaging) was performed in 90 patients undergoing quantitative coronary angiography, 48 of whom had significant (>50%) stenoses. MCE was repeated with exercise alone in 18 patients. Myocardial blood flow (A*beta) was obtained from blood volume (A) and time to refill (beta). RESULTS Quantification of flow reserve was feasible in 88%. The mean A*beta reserve in the anterior wall was significantly impaired for patients with left anterior descending coronary artery disease (n = 28) compared with those with no disease (1.6 ± 1.2 vs 4.0 ± 2.5, P ≤ .001). This reflected impaired beta reserve, with no difference in the A reserve. Applying a receiver operating characteristic curve-derived cutoff of 2.0 for A*beta reserve, quantitative MCE was 76% sensitive and 71% specific for the diagnosis of significant left anterior descending coronary artery stenosis. Posterior circulation results were similar, with 78% sensitivity and 59% specificity for detection of posterior CAD. Overall, quantitative MCE was similarly sensitive to the qualitative approach for diagnosis of CAD (88% vs 93%), but with lower specificity (52% vs 65%, P = .07). In 18 patients restudied with pure exercise stress, the mean myocardial blood flow reserve was less than after combined stress (2.1 ± 1.6 vs 3.7 ± 1.9, P = .01). CONCLUSION Quantitative MCE is feasible for the diagnosis of CAD with dipyridamole/exercise stress.
Dipyridamole prolongs postexercise hyperemia, augmenting the degree of hyperemia at the time of imaging. Myocardial contrast echocardiography (MCE) is a new technique for assessing myocardial perfusion that uses intracoronary injections of microbubbles of air. Because these microbubbles have a mean diameter of 4.3 ± 0.3 microns and an intravascular rheology similar to that of red blood cells (RBCs), we hypothesized that their mean myocardial transit rates recorded on echocardiography would provide an estimation of regional myocardial blood flow in the in vivo beating heart. Accordingly, blood flow to the left anterior descending coronary artery (LAD) of 12 open-chest anesthetized dogs (group I) was adjusted to 4 to 6 flows (total of 60 flows), and microbubbles and radiolabeled RBCs were injected into the LAD in a random order at each stage. The mean myocardial RBC transit rates were measured by fitting a gamma-variate function to time-activity plots generated by placing a miniature CsI2 probe over the anterior surface of the heart, and the mean myocardial microbubble transit rates were measured from time-intensity plots derived from off-line analysis of MCE images obtained during the injection of microbubbles. An excellent correlation was noted between flow (measured with an extracorporeal electromagnetic flow probe) and mean myocardial RBC transit rate (y = 2.83 × 10^(-3)x + 0.01, r = .96, SEE = 0.02, P < .001). A close correlation was also noted between mean RBC and microbubble myocardial transit rates (y = 1.01x + 0.01, r = .89, SEE = 0.02, P < .001). Despite its theoretical advantages, a lagged normal density function did not provide a better fit to the MCE data than the gamma-variate function.
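The gamma-variate fitting step described above can be reproduced in a few lines; a minimal sketch with synthetic data (SciPy assumed; the parameter values are illustrative, not from the study):

```python
import numpy as np
from scipy.optimize import curve_fit

def gamma_variate(t, A, alpha, beta):
    # Classic indicator-dilution curve: y(t) = A * t^alpha * exp(-t/beta)
    return A * np.power(t, alpha) * np.exp(-t / beta)

# Synthetic noiseless time-intensity data with known parameters
t = np.linspace(0.1, 20.0, 200)
true_params = (2.0, 1.5, 2.5)
y = gamma_variate(t, *true_params)

# Fit, then derive the mean transit time; for this curve the
# intensity-weighted mean of t works out to MTT = beta * (alpha + 1)
popt, _ = curve_fit(gamma_variate, t, y, p0=(1.0, 1.0, 1.0))
A_fit, alpha_fit, beta_fit = popt
mtt = beta_fit * (alpha_fit + 1.0)
```

On real time-activity or time-intensity plots one would add noise handling and sensible initial guesses; the mean transit rate used in the study is then simply the reciprocal of the mean transit time.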
(ABSTRACT TRUNCATED AT 250 WORDS) OBJECTIVES The purpose of this study was to compare the assessment of myocardial perfusion by myocardial parametric quantification (MPQ) with technetium-99m sestamibi single-photon emission computed tomographic (SPECT) imaging in humans. BACKGROUND Accurate visual interpretation of myocardial contrast echocardiographic (MCE) images is qualitative and requires considerable experience. Current computer-assisted quantitative perfusion protocols are tedious and lack spatial resolution. Myocardial parametric quantification is a novel method that quantifies, color encodes, and displays perfusion data as a set of myocardial parametric images according to the relative degree of perfusion. METHODS Forty-six consecutive patients underwent prospective stress/rest technetium-99m sestamibi gated-SPECT imaging and MCE using intravenous Optison or Definity. Apical two- and four-chamber cine loops at rest and after dipyridamole (0.56 mg/kg) stress were acquired. For each patient, the following assessments of myocardial perfusion were performed: 1) visual cine-loop assessment (VIS); 2) MPQ assessment; and 3) combined VIS + MPQ assessment. RESULTS The segmental rates of agreement for myocardial perfusion with SPECT were 83%, 89%, and 92% (kappa = 0.46, 0.58, and 0.68) for VIS, MPQ, and VIS + MPQ, respectively. Similar trends were seen for the classification of the presence or absence of a moderate to severe perfusion defect, with the agreement for VIS, MPQ, and VIS + MPQ being 92%, 97%, and 97%, respectively. CONCLUSIONS Myocardial parametric quantification demonstrates good agreement with SPECT and incremental agreement with VIS. Analysis strategies that incorporate MPQ demonstrate better agreement with SPECT than visual analysis alone. AIMS Parametric imaging of myocardial perfusion provides useful visual information for the diagnosis of coronary artery disease (CAD).
We developed a technique for automated detection of perfusion defects based on quantitative analysis of parametric perfusion images and validated it against coronary angiography. METHODS AND RESULTS Contrast-enhanced, apical 2-, 3- and 4-chamber images were obtained at rest and with dipyridamole in 34 patients with suspected CAD. Images were analyzed to generate parametric perfusion images of the standard contrast-replenishment model parameters A, beta and A.beta. Each parametric image was divided into six segments, and mean parameter value (MPV) was calculated for each segment. Segmental MPV ratio between stress and rest was defined as a flow reserve index (FRI). Receiver operating characteristic (ROC) analysis was used in a Study group (N = 17) to optimize the FRI threshold and the minimal number of abnormal segments per vascular territory (LAD and non-LAD) required for automated detection of stress-induced perfusion defects. The optimized detection algorithm was then tested prospectively in the remaining 17 patients (Test group). LAD and non-LAD stenosis >70% was found in 19 and 17 patients, respectively. In the Study group, the FRI threshold was: LAD = 0.95 and non-LAD = 0.68; the minimal number of abnormal segments was four and two, correspondingly. Sensitivity, specificity and accuracy in the Test group were: 75%, 67% and 71% in the LAD, and 75%, 75% and 75% in the non-LAD territories. CONCLUSION Automated quantitative analysis of contrast echocardiographic parametric perfusion images is feasible and may aid in the objective detection of CAD.
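The ROC-derived FRI threshold described above amounts to sweeping candidate cutoffs and keeping the one with the best sensitivity/specificity trade-off; a minimal sketch using Youden's J statistic (the data here are synthetic, and the convention that a low FRI flags disease follows the paper's direction of effect):

```python
import numpy as np

def best_fri_cutoff(fri, diseased):
    """Pick the FRI threshold maximizing Youden's J = sens + spec - 1.
    fri: per-territory flow reserve indices; diseased: 0/1 angiographic labels."""
    fri = np.asarray(fri, dtype=float)
    diseased = np.asarray(diseased, dtype=bool)
    best_t, best_j = None, -1.0
    for t in np.unique(fri):                     # candidate cutoffs, sorted
        flagged = fri <= t                       # low reserve -> call abnormal
        sens = flagged[diseased].mean()          # true positive rate
        spec = (~flagged[~diseased]).mean()      # true negative rate
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Synthetic, cleanly separated example: diseased territories have low FRI
cutoff, j = best_fri_cutoff([0.5, 0.6, 0.7, 0.8, 1.2, 1.3, 1.4],
                            [1, 1, 1, 1, 0, 0, 0])
```

With perfectly separated groups the sweep lands on the largest diseased FRI (here 0.8) with J = 1; on real, overlapping data the maximum J falls below 1 and the chosen cutoff trades sensitivity against specificity.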
11,090
26,052,181
A systematic review of the literature around these examples reveals that computational modeling is having an impact on empirical research in cognitive development; however, this impact does not extend to neural and clinical research.
This paper examines the contributions of dynamic systems theory to the field of cognitive development, focusing on modeling using dynamic neural fields. A brief overview highlights the contributions of dynamic systems theory and the central concepts of dynamic field theory (DFT).
This study investigated whether children's spatial recall performance shows three separable characteristics: (1) biases away from symmetry axes (geometric effects); (2) systematic drift over delays; and (3) biases toward the exemplar distribution experienced in the task (experience-dependent effects). In Experiment 1, the location of one target within each geometric category was varied. Children's responses showed biases away from a midline axis that increased over delays. In Experiment 2, multiple targets were placed within each category at the same locations used in Experiment 1. After removing geometric effects, 6-year-olds' (but not 11-year-olds') responses were biased toward the average remembered location over learning. In Experiment 3, children responded to one target more frequently than the others. Both 6- and 11-year-olds showed biases toward the most frequent target over learning. These results provide a bridge between the performance of younger children and adults, demonstrating continuity in the processes that underlie spatial memory abilities across development The Corsi Block-Tapping task has been utilized as a measure of spatial memory in both clinical and research contexts for several decades. Despite its wide application, the task has been employed with extraordinary variability in administration and scoring and in the composition of stimulus item sets. We have generated a set of test items containing quasi-randomly derived block-tapping sequences. In another study, we investigated item difficulty as a function of path configuration and showed a decline in performance with increasing span capacity load. In the current cross-sectional study, we evaluated developmental differences in span capacity by measuring performances of school children from grade 1 (M age = 7 years) to grade 8 (M age = 14) and a young adult sample (M age = 21 years).
Mean span capacity increased incrementally and linearly with age, and no gender difference was observed. The increase in performance with advancing age supports the notion that spatial immediate memory capacity increases with maturation throughout childhood. Comparisons indicated that the span capacity of eighth graders (M = 6.9) was not statistically different from that of the young adults (M = 7.1), suggesting an upper developmental plateau for spatial span in early adolescence. This study provides a normative database for this widely utilized measure of spatial memory. Some of the data contained in this paper were presented at the meeting of the International Neuropsychological Society, Honolulu, February 2003 Thelen and colleagues recently proposed a dynamic field theory (DFT) to capture the general processes that give rise to infants' performance in the Piagetian A-not-B task. According to this theory, the same general processes should operate in noncanonical A-not-B-type tasks with children older than 12 months. Three predictions of the DFT were tested by examining 3-year-olds' location memory errors in a task with a homogeneous task space. Children pointed to remembered locations after delays of 0 s to 10 s. The spatial layout of the possible targets and the frequency with which children moved to each target was varied. As predicted by the DFT, children's responses showed a continuous spatial drift during delays toward a longer term memory of previously moved-to locations. Furthermore, these delay-dependent effects were reduced when children moved to an "A" location on successive trials, and were magnified on the first trial to a nearby "B" location. Thus, the DFT generalized to capture the performance of 3-year-old children in a new task.
In contrast to predictions of the DFT, however, 3-year-olds' responses were also biased toward the midline of the task space, an effect predicted by the category adjustment (CA) model. These data suggest that young children's spatial memory responses are affected by delay- and experience-dependent processes as well as the geometric structure of the task space. Consequently, two current models of spatial memory, the DFT and the CA model, provide incomplete accounts of children's location memory abilities Two experiments examined how information about what objects are influences memory for where objects are located. Seven-, 9-, and 11-year-old children and adults learned the locations of 20 objects marked by dots on the floor of a box. The objects belonged to 4 categories. In one condition, objects belonging to the same category were located in the same quadrant of the box. In another condition, objects and locations were randomly paired. After learning, participants attempted to replace the objects without the aid of the dots. Children and adults placed the objects in the same quadrant closer together when they were related than when they were unrelated, indicating that object information led to systematic biases in location memory 25 infants were tested every 2 weeks on the AB Object Permanence Task devised by Piaget, from the age when they first reached for a hidden object until they were 12 months. The delay between hiding and retrieval necessary to produce the AB error increased continuously throughout this period at an average rate of 2 sec/month, from under 2 sec at 7 1/2 months to over 10 sec by 12 months. All children displayed the AB error repeatedly over the months of testing. Large between-children differences in delay needed for the AB error were found at each age. Girls tolerated longer delays than boys. The characteristic pattern to the AB error did not vary over age or sex.
Range of delay producing the AB error in any child was small. Errors disappeared when delays were reduced by 2-3 sec, and reaching became random or severely perseverative when delays were increased 2-3 sec above the level producing the AB error. AB provides an index of the ability to carry out an intention based on stored information despite a conflicting habitual tendency Recently, Smith, Thelen, and colleagues proposed a dynamic systems account of the Piagetian "A-not-B" error in which infants' errors result from general processes that make goal-directed actions to remembered locations. Based on this account, the A-not-B error should be a general phenomenon, observable in different tasks and at different points in development. Smith, Thelen, et al.'s proposal was tested using an A-not-B version of a sandbox task. During three training trials and three "A" trials, 2-year-olds watched as a toy was buried in a sandbox at Location A. Following a 10-s delay, children searched for the object. Across five experiments, children's (total N = 92) performance on the A trials was accurate. After the A trials, children watched as a toy was hidden at Location B, 8 to 10 inches from Location A. In all experiments, children's searches after a 10-s delay were significantly biased in the direction of Location A. Furthermore, this bias toward Location A decreased with repeated trials to Location B, as well as when children completed fewer trials to Location A. Together, these data suggest that A-not-B-type errors are pervasive across tasks and development People use geometric cues to form spatial categories. This study investigated whether people also use the spatial distribution of exemplars. Adults pointed to remembered locations on a tabletop. In Experiment 1, a target was placed in each geometric category, and the location of targets was varied.
Adults' responses were biased away from a midline category boundary toward geometric prototypes located at the centers of left and right categories. Experiment 2 showed that prototype effects were not influenced by cross-category interactions. In Experiment 3, subsets of targets were positioned at different locations within each category. When prototype effects were removed, there was a bias toward the center of the exemplar distribution, suggesting that common categorization processes operate across spatial and object domains
11,091
28,915,851
Issues relating to consent were the most significant theme identified. Type of consent differed depending on the condition or intervention being studied. The country in which the research took place did not appear to influence the type of consent, apart from the USA where exception from consent appeared to be most commonly used. A wide range of terms were used to describe consent. Conclusions Consent was the main ethical consideration in published ambulance-based research. A range of consent models were used, ranging from informed consent to exception from consent (waiver of consent).
Background We sought to understand the main ethical considerations when conducting clinical trials in the prehospital ambulance-based setting.
Introduction Emergency calls to ambulance services are frequent for older people who have fallen, but ambulance crews often leave patients at the scene without ongoing care. Evidence shows that when left at home with no further support older people often experience subsequent falls which result in injury and emergency-department attendances. SAFER 2 is an evaluation of a new clinical protocol which allows paramedics to assess and refer older people who have fallen, and do not need hospital care, to community-based falls services. In this protocol paper, we report methods and progress during trial implementation. SAFER 2 is recruiting patients through three ambulance services. A successful trial will provide robust evidence about the value of this new model of care, and enable ambulance services to use resources efficiently. Design Pragmatic cluster randomised trial. Methods and analysis We randomly allocated 25 participating ambulance stations (clusters) in three services to intervention or control group. Intervention paramedics received training and clinical protocols for assessing and referring older people who have fallen to community-based falls services when appropriate, while control paramedics deliver care as usual. Patients are eligible for the trial if they are aged 65 or over; resident in a participating falls service catchment area; and attended by a trial paramedic following an emergency call coded as a fall without priority symptoms. The principal outcome is the rate of further emergency contacts (or death), for any cause and for falls. Secondary outcomes include further falls, health-related quality of life, ‘fear of falling’, patient satisfaction reported by participants through postal questionnaires at 1 and 6 months, and quality and pathways of care at the index incident.
We shall compare National Health Service (NHS) and patient/carer costs between intervention and control groups and estimate quality-adjusted life years (QALYs) gained from the intervention and thus incremental cost per QALY. We shall estimate wider system effects on key performance indicators. We shall interview 60 intervention patients, and conduct focus groups with contributing NHS staff to explore their experiences of the assessment and referral service. We shall analyse quantitative trial data by ‘treatment allocated’, and qualitative data using content analysis. Ethics and dissemination The Research Ethics Committee for Wales gave ethical approval and each participating centre gave NHS Research and Development approval. We shall disseminate study findings through peer-reviewed publications and conference presentations. Trial Registration: ISRCTN AIMS To investigate the feasibility of delivering titrated oxygen therapy to adults with return of spontaneous circulation (ROSC) following out-of-hospital cardiac arrest (OHCA) caused by ventricular fibrillation (VF) or ventricular tachycardia (VT). METHODS We used a multicentre, randomised, single blind, parallel groups design to compare titrated and standard oxygen therapy in adults resuscitated from VF/VT OHCA. The intervention commenced in the community following ROSC and was maintained in the emergency department and the Intensive Care Unit. The primary end point was the median oxygen saturation by pulse oximetry (SpO2) in the pre-hospital period. RESULTS 159 OHCA patients were screened and 18 were randomised. 17 participants were analysed: nine in the standard care group and eight in the titrated oxygen group. In the pre-hospital period, SpO2 measurements were lower in the titrated oxygen therapy group than the standard care group (difference in medians 11.3%; 95% CI 1.0-20.5%).
Low measured oxygen saturation (SpO2<88%) occurred in 7/8 of patients in the titrated oxygen group and 3/9 of patients in the standard care group (P=0.05). Following hospital admission, good separation of oxygen exposure between the groups was achieved without a significant increase in hypoxia events. The trial was terminated because accumulated data led the Data Safety Monitoring Board and Management Committee to conclude that safe delivery of titrated oxygen therapy in the pre-hospital period was not feasible. CONCLUSIONS Titration of oxygen in the pre-hospital period following OHCA was not feasible; it may be feasible to titrate oxygen safely after arrival in hospital AIM To determine whether, in patients with an ambulance response time of >5 min who were in VF cardiac arrest, 3 min of CPR before the first defibrillation was more effective than immediate defibrillation in improving survival to hospital discharge. METHODS This randomised controlled trial was run by the South Australian Ambulance Service between 1 July, 2005, and 31 July, 2007. Patients in VF arrest were eligible for randomisation. Exclusion criteria were: (i) <18 years of age, (ii) traumatic arrest, (iii) paramedic witnessed arrest, (iv) advanced life support performed before arrival of paramedics and (v) not for resuscitation order or similar directive. The primary outcome was survival to hospital discharge, with secondary outcomes being neurological status at discharge, the rate of return of spontaneous circulation (ROSC) and the time from first defibrillation to ROSC. RESULTS For all response times, no differences were observed between the immediate defibrillation group and the CPR first group in survival to hospital discharge (17.1% [18/105] vs. 10.3% [10/97]; P=0.16), the rate of ROSC (53.3% [56/105] vs. 50.5% [49/97]; P=0.69) or the time from the first defibrillation to ROSC (12:37 vs. 11:19; P=0.49).
There were also no differences between the immediate defibrillation group and the CPR first group for response times ≤5 min or >5 min: survival to hospital discharge (50.0% [7/14] vs. 25.0% [4/16]; P=0.16, or 12.1% [11/91] vs. 7.4% [6/81]; P=0.31, respectively) and the rate of ROSC (71.4% [10/14] vs. 75.0% [12/16]; P=0.83, or 50.5% [46/91] vs. 45.7% [37/81]; P=0.54, respectively). No differences were observed in the neurological status of those surviving to hospital discharge. CONCLUSION For patients in out-of-hospital VF cardiac arrest we found no evidence to support the use of 3 min of CPR before the first defibrillation over the accepted practice of immediate defibrillation Background The utility of advanced prehospital interventions for severe blunt traumatic brain injury (BTI) remains controversial. Of all trauma patient subgroups it has been anticipated that this patient group would most benefit from advanced prehospital interventions, as hypoxia and hypotension have been demonstrated to be associated with poor outcomes and these factors may be amenable to prehospital intervention. Supporting evidence is largely lacking, however. In particular the efficacy of early anaesthesia/muscle relaxant assisted intubation has proved difficult to substantiate. Methods This article describes the design and protocol of the Head Injury Retrieval Trial (HIRT), which is a randomised controlled single centre trial of physician prehospital care (delivering advanced interventions such as rapid sequence intubation and blood transfusion) in addition to paramedic care for severe blunt TBI, compared with paramedic care alone. Results Primary endpoint is Glasgow Outcome Scale score at six months post injury. Issues with trial integrity resulting from drop-ins from standard care to the treatment arm as the result of policy changes by the local ambulance system are discussed.
Conclusion This randomised controlled trial will contribute to the evaluation of the efficacy of advanced prehospital interventions in severe blunt TBI. Trial Registration ClinicalTrials.gov: Background Advanced prehospital interventions for severe brain injury remain controversial. No previous randomised trial has been conducted to evaluate additional physician intervention compared with paramedic only care. Methods Participants in this prospective, randomised controlled trial were adult patients with blunt trauma with either a scene GCS score <9 (original definition), or GCS<13 and an Abbreviated Injury Scale score for the head region ≥3 (modified definition). Patients were randomised to either standard ground paramedic treatment or standard treatment plus a physician arriving by helicopter. Patients were evaluated by 30-day mortality and 6-month Glasgow Outcome Scale (GOS) scores. Due to high non-compliance rates, both intention-to-treat and as-treated analyses were preplanned. Results 375 patients met the original definition, of which 197 were allocated to physician care. Differences in the 6-month GOS scores were not significant on intention-to-treat analysis (OR 1.11, 95% CI 0.74 to 1.66, p=0.62), nor was the 30-day mortality (OR 0.91, 95% CI 0.60 to 1.38, p=0.66). As-treated analysis showed a 16% reduction in 30-day mortality in those receiving additional physician care; 60/195 (29%) versus 81/180 (45%), p<0.01, number needed to treat = 6. 338 patients met the modified definition, of which 182 were allocated to physician care. The 6-month GOS scores were not significantly different on intention-to-treat analysis (OR 1.14, 95% CI 0.73 to 1.75, p=0.56), nor was the 30-day mortality (OR 1.05, 95% CI 0.66 to 1.66, p=0.84). As-treated analyses were also not significantly different.
Conclusions This trial suggests a potential mortality reduction in patients with blunt trauma with GCS<9 receiving additional physician care (original definition only). Confirmatory studies which also address non-compliance issues are needed. Trial registration number NCT00112398 IMPORTANCE A strategy using mechanical chest compressions might improve the poor outcome in out-of-hospital cardiac arrest, but such a strategy has not been tested in large clinical trials. OBJECTIVE To determine whether administering mechanical chest compressions with defibrillation during ongoing compressions (mechanical CPR), compared with manual cardiopulmonary resuscitation (manual CPR) according to guidelines, would improve 4-hour survival. DESIGN, SETTING, AND PARTICIPANTS Multicenter randomized clinical trial of 2589 patients with out-of-hospital cardiac arrest conducted between January 2008 and February 2013 in 4 Swedish, 1 British, and 1 Dutch ambulance services and their referring hospitals. Duration of follow-up was 6 months. INTERVENTIONS Patients were randomized to receive either mechanical chest compressions (LUCAS Chest Compression System, Physio-Control/Jolife AB) combined with defibrillation during ongoing compressions (n = 1300) or manual CPR according to guidelines (n = 1289). MAIN OUTCOMES AND MEASURES Four-hour survival, with secondary end points of survival up to 6 months with good neurological outcome using the Cerebral Performance Category (CPC) score. A CPC score of 1 or 2 was classified as a good outcome. RESULTS Four-hour survival was achieved in 307 patients (23.6%) with mechanical CPR and 305 (23.7%) with manual CPR (risk difference, -0.05%; 95% CI, -3.3% to 3.2%; P > .99).
Survival with a CPC score of 1 or 2 occurred in 98 (7.5%) vs 82 (6.4%) (risk difference, 1.18%; 95% CI, -0.78% to 3.1%) at intensive care unit discharge, in 108 (8.3%) vs 100 (7.8%) (risk difference, 0.55%; 95% CI, -1.5% to 2.6%) at hospital discharge, in 105 (8.1%) vs 94 (7.3%) (risk difference, 0.78%; 95% CI, -1.3% to 2.8%) at 1 month, and in 110 (8.5%) vs 98 (7.6%) (risk difference, 0.86%; 95% CI, -1.2% to 3.0%) at 6 months with mechanical CPR and manual CPR, respectively. Among patients surviving at 6 months, 99% in the mechanical CPR group and 94% in the manual CPR group had CPC scores of 1 or 2. CONCLUSIONS AND RELEVANCE Among adults with out-of-hospital cardiac arrest, there was no significant difference in 4-hour survival between patients treated with the mechanical CPR algorithm or those treated with guideline-adherent manual CPR. The vast majority of survivors in both groups had good neurological outcomes by 6 months. In clinical practice, mechanical CPR using the presented algorithm did not result in improved effectiveness compared with manual CPR. TRIAL REGISTRATION clinicaltrials.gov Identifier: NCT00609778 Background There is little in-depth research into how patients feel about emergency medical trials, and what influences these feelings. Objectives To investigate patients' feelings on taking part in emergency medical research, particularly trials conducted without prospective consent. Methods Seventeen inpatients, all recently admitted with a medical emergency, were interviewed. Questions focused on feelings on taking part in hypothetical trials, particularly trials conducted with deferred consent. Results Five main themes were identified. Level of trust in the medical profession: high levels of trust tended to correlate with willingness to participate in trials. Previous bad healthcare experiences tended to diminish trust.
Concerns for personal well-being: patients identified a conflict between aversion to unknown side effects and desire for access to newer and potentially better treatments. Some would be less inclined to participate in research if they were severely unwell, some more so. Altruism: many cited the importance of helping to advance medical knowledge and of ‘giving back to the health service’. Concerns over autonomy: some felt that deferred consent was a violation of personal autonomy. Uncertainty: many patients seemed to struggle to understand the more complex concepts discussed. Conclusions Patients are broadly trusting, and open to participating in emergency medical trials, but want to be kept as informed as possible throughout the process. Willingness may be improved by providing more complete explanations, although this may be limited by the complexity of relevant concepts. Good communication and improved public understanding of clinical trials would likely increase acceptance of emergency care research Background High blood pressure (BP) during acute stroke is associated with poorer stroke outcome. Trials of treatments to lower BP have not resulted in improved outcome, but this may be because treatment commenced too late. Emergency medical service staff (paramedics) are uniquely placed to administer early treatment; however, experience of prehospital randomised controlled trials (RCTs) is very limited. Methods We conducted a pilot RCT to determine the feasibility of a definitive prehospital BP-lowering RCT in acute stroke. Paramedics were trained to identify, consent and deliver a first dose of lisinopril or placebo to adults with suspected stroke and hypertension while responding to the emergency call. Further treatment continued in hospital. Study eligibility, recruitment rate, completeness of receipt of study medication and clinical data (eg, BP) were collected to inform the design of a definitive RCT.
Results In 14 months, 14 participants (median age=73 years, median National Institutes of Health Stroke Scale=4) were recruited and received the prehospital dose of medication. Median time from stroke onset (as assessed by paramedic) to treatment was 70 min. Four participants completed 7 days of study treatment. Of ambulance transported suspected stroke patients, 1% were both study eligible and attended by a PIL-FAST paramedic. Conclusions It is possible to conduct a paramedic-initiated double-blind RCT of a treatment for acute stroke. However, to perform a definitive RCT in a reasonable timescale, a large number of trained paramedics across several ambulance services would be needed to recruit the number of patients likely to be required. Clinical trial registration http://www.clinicaltrials.gov. Unique identifier: NCT01066572 Background In Australia approximately 25% of Emergency Department (ED) attendances are via ambulance. ED overcrowding in Australia, as in many countries, is common. Measures to reduce overcrowding include the provision of enhanced timely primary care in the community for appropriate low risk injury and illness. Therefore paramedic assessment and referral to a community home hospital service, in preference to transfer to ED, may confer clinical and cost benefit. Methods/Design A randomised controlled trial. Consenting adult patients that call an ambulance and are assessed by paramedics as having an eligible low risk problem will be randomised to referral to ED via ambulance transfer or referral to a rapid response service that will assess and treat the patient in their own residence. The primary outcome measure is requirement for unplanned medical attention (in or out of hospital) in the first 48 hours. Secondary outcomes will include a number of other clinical endpoints. A cost effectiveness analysis will be conducted.
Discussion If this trial demonstrates clinical non-inferiority and cost savings associated with the primary assessment service, it will provide one means to safely address ED overcrowding. Trial Registration Australian and New Zealand Clinical Trials Registry Number Summary Objective To evaluate the feasibility of a prehospital randomized controlled trial comparing transcutaneous pacing (TCP) with dopamine for unstable bradycardia. Methods Unstable bradycardic patients who failed to respond to a fluid bolus and up to 3 mg atropine were enrolled. The intervention was dopamine or TCP with crossover to dopamine if TCP failed. The primary outcome was survival to discharge or 30 days. Randomization compliance, safety, follow-up rates, primary outcome, and sample size requirements were assessed. Results Of 383 patients with unstable bradycardia, 151 (39%) failed to respond to atropine or fluid and were eligible for enrolment, and 82 (55%) were correctly enrolled. Fifty-five (36%) of eligible patients could not be enrolled for practical reasons; 3 had advance directives, 32 met inclusion criteria on arrival at hospital and in 20 cases, paramedics chose not to enroll based on the circumstances of the case. The remaining 13 were missed cases; 8 were missing randomization envelopes and in 5, the paramedic forgot. Randomization compliance was 95% (78/82). Forty-two (51%) patients were randomized to TCP and seven of these crossed over to dopamine. Two cases were randomized but did not receive the intervention, either due to lack of time or loss of IV access. Three adverse events occurred in each group. Survival to discharge or 30 days in hospital was 70% (28/40) and 69% (29/42) in the dopamine and TCP groups, respectively, with 100% follow up. To detect a 10% relative difference in 30-day survival between treatment arms, a sample size of 690 per group would be required.
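The sample-size figure just quoted (690 per group to detect a 10% relative difference in roughly 70% survival) can be reproduced approximately with the standard two-proportion formula. This is only an illustrative sketch: the abstract does not state the alpha, power, or exact method used, so a two-sided alpha of 0.05 and 80% power are assumed here, which gives a slightly larger number than the 690 quoted.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.80):
    """Per-group sample size for comparing two proportions
    (unpooled normal approximation)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # critical value, two-sided alpha
    z_b = NormalDist().inv_cdf(power)          # critical value for power
    var = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_a + z_b) ** 2 * var / (p1 - p2) ** 2)

# ~70% survival vs a 10% relative reduction -> 63%
print(n_per_group(0.70, 0.63))  # 710, in the same range as the 690 quoted
```

The small gap between 710 and the reported 690 would be explained by different alpha/power assumptions or a pooled-variance variant of the formula.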
Conclusions It is feasible to conduct a prehospital randomized controlled trial of TCP for unstable bradycardia, and a definitive trial would require a multi-centre study BACKGROUND Mechanical chest compression devices have the potential to help maintain high-quality cardiopulmonary resuscitation (CPR), but despite their increasing use, little evidence exists for their effectiveness. We aimed to study whether the introduction of LUCAS-2 mechanical CPR into front-line emergency response vehicles would improve survival from out-of-hospital cardiac arrest. METHODS The pre-hospital randomised assessment of a mechanical compression device in cardiac arrest (PARAMEDIC) trial was a pragmatic, cluster-randomised open-label trial including adults with non-traumatic, out-of-hospital cardiac arrest from four UK Ambulance Services (West Midlands, North East England, Wales, South Central). 91 urban and semi-urban ambulance stations were selected for participation. Clusters were ambulance service vehicles, which were randomly assigned (1:2) to LUCAS-2 or manual CPR. Patients received LUCAS-2 mechanical chest compression or manual chest compressions according to the first trial vehicle to arrive on scene. The primary outcome was survival at 30 days following cardiac arrest and was analysed by intention to treat. Ambulance dispatch staff and those collecting the primary outcome were masked to treatment allocation. Masking of the ambulance staff who delivered the interventions and reported initial response to treatment was not possible. The study is registered with Current Controlled Trials, number ISRCTN08233942. FINDINGS We enrolled 4471 eligible patients (1652 assigned to the LUCAS-2 group, 2819 assigned to the control group) between April 15, 2010 and June 10, 2013. 985 (60%) patients in the LUCAS-2 group received mechanical chest compression, and 11 (<1%) patients in the control group received LUCAS-2.
In the intention-to-treat analysis, 30 day survival was similar in the LUCAS-2 group (104 [6%] of 1652 patients) and in the manual CPR group (193 [7%] of 2819 patients; adjusted odds ratio [OR] 0.86, 95% CI 0.64-1.15). No serious adverse events were noted. Seven clinical adverse events were reported in the LUCAS-2 group (three patients with chest bruising, two with chest lacerations, and two with blood in mouth). 15 device incidents occurred during operational use. No adverse or serious adverse events were reported in the manual group. INTERPRETATION We noted no evidence of improvement in 30 day survival with LUCAS-2 compared with manual compressions. On the basis of ours and other recent randomised trials, widespread adoption of mechanical CPR devices for routine use does not improve survival. FUNDING National Institute for Health Research HTA - 07/37/69 OBJECTIVE To compare integrated automated load-distributing band CPR (iA-CPR) with high-quality manual CPR (M-CPR) to determine equivalence, superiority, or inferiority in survival to hospital discharge. METHODS Between March 5, 2009 and January 11, 2011 a randomized, unblinded, controlled group sequential trial of adult out-of-hospital cardiac arrests of presumed cardiac origin was conducted at three US and two European sites. After EMS providers initiated manual compressions, patients were randomized to receive either iA-CPR or M-CPR. Patient follow-up was until all patients were discharged alive or died. The primary outcome, survival to hospital discharge, was analyzed adjusting for covariates (age, witnessed arrest, initial cardiac rhythm, enrollment site) and interim analyses. CPR quality and protocol adherence were monitored (CPR fraction) electronically throughout the trial. RESULTS Of 4753 randomized patients, 522 (11.0%) met post enrollment exclusion criteria. Therefore, 2099 (49.6%) received iA-CPR and 2132 (50.4%) M-CPR.
Sustained ROSC (emergency department admittance), 24-h survival and hospital discharge (unknown for 12 cases) for iA-CPR compared to M-CPR were 600 (28.6%) vs. 689 (32.3%), 456 (21.8%) vs. 532 (25.0%), and 196 (9.4%) vs. 233 (11.0%) patients, respectively. The adjusted odds ratio of survival to hospital discharge for iA-CPR compared to M-CPR was 1.06 (95% CI 0.83–1.37), meeting the criteria for equivalence. The 20-min CPR fraction was 80.4% for iA-CPR and 80.2% for M-CPR. CONCLUSION Compared to high-quality M-CPR, iA-CPR resulted in statistically equivalent survival to hospital discharge.

PURPOSE OF THE STUDY IV line insertion and drugs did not affect long-term survival in an out-of-hospital cardiac arrest (OHCA) randomized clinical trial (RCT). In a previous large registry study adrenaline was negatively associated with survival from OHCA. The present post hoc analysis on the RCT data compares outcomes for patients actually receiving adrenaline to those not receiving adrenaline. MATERIALS AND METHODS Patients from a RCT performed May 2003 to April 2008 were included. Three patients from the original intention-to-treat analysis were excluded due to insufficient documentation of adrenaline administration. Quality of cardiopulmonary resuscitation (CPR) and clinical outcomes were compared. RESULTS Clinical characteristics were similar and CPR quality comparable and within guideline recommendations for 367 patients receiving adrenaline and 481 patients not receiving adrenaline. Odds ratio (OR) for being admitted to hospital, being discharged from hospital and surviving with favourable neurological outcome for the adrenaline vs. no-adrenaline group was 2.5 (CI 1.9, 3.4), 0.5 (CI 0.3, 0.8) and 0.4 (CI 0.2, 0.7), respectively. Ventricular fibrillation, response interval, witnessed arrest, gender, age and endotracheal intubation were confounders in multivariate logistic regression analysis.
OR for survival for adrenaline vs. no-adrenaline adjusted for confounders was 0.52 (95% CI: 0.29, 0.92). CONCLUSION Receiving adrenaline was associated with improved short-term survival, but decreased survival to hospital discharge and survival with favourable neurological outcome after OHCA. This post hoc survival analysis is in contrast to the previous intention-to-treat analysis of the same data, but agrees with previous non-randomized registry data. This shows the limitations of non-randomized or non-intention-to-treat analyses.

BACKGROUND The role of routine supplemental oxygen for patients with uncomplicated acute myocardial infarction (AMI) has recently been questioned. There is conflicting data on the possible effects of hyperoxia on ischemic myocardium. The few clinical trials examining the role of oxygen in AMI were performed prior to the modern approach of emergent reperfusion and advanced medical management. METHODS The Air Versus Oxygen In myocarDial infarction study (AVOID Study) is a prospective, multi-centre, randomized, controlled trial conducted by Ambulance Victoria and participating metropolitan Melbourne hospitals with primary percutaneous coronary intervention capabilities. The purpose of the study is to determine whether withholding routine supplemental oxygen therapy in patients with acute ST-elevation myocardial infarction but without hypoxia prior to reperfusion decreases myocardial infarct size. AVOID will enroll 490 patients, >18 years of age, with acute ST-elevation myocardial infarction of less than 12 hours duration. CONCLUSIONS There is an urgent need for clinical trials examining the role of oxygen in AMI. AVOID will seek to clarify this important issue.
Results from this study may have widespread implications for the treatment of AMI and the use of oxygen in both the pre-hospital and hospital setting.

BACKGROUND In patients with ST-segment elevation myocardial infarction (STEMI) triaged to primary percutaneous coronary intervention (PCI), anticoagulation often is initiated in the ambulance during transfer to a PCI site. In this prehospital setting, bivalirudin has not been compared with standard-of-care anticoagulation. In addition, it has not been tested in conjunction with the newer P2Y12 inhibitors prasugrel or ticagrelor. DESIGN EUROMAX is a randomized, international, prospective, open-label ambulance trial comparing bivalirudin with standard-of-care anticoagulation with or without glycoprotein IIb/IIIa inhibitors in 2200 patients with STEMI intended for primary percutaneous coronary intervention (PCI), presenting either via ambulance or to centers where PCI is not performed. Patients will receive either bivalirudin given as a 0.75 mg/kg bolus followed immediately by a 1.75-mg/kg per hour infusion for ≥30 minutes prior to primary PCI and continued for ≥4 hours after the end of the procedure at the reduced dose of 0.25 mg/kg per hour, or heparins at guideline-recommended doses, with or without routine or bailout glycoprotein IIb/IIIa inhibitor treatment according to local practice. The primary end point is the composite incidence of death or non-coronary-artery-bypass-graft-related protocol major bleeding at 30 days by intention to treat.
CONCLUSION The EUROMAX trial will test whether bivalirudin started in the ambulance and continued for 4 hours after primary PCI improves clinical outcomes compared with guideline-recommended standard-of-care heparin-based regimens, and will also provide information on the combination of bivalirudin with prasugrel or ticagrelor.

BACKGROUND The best initial approach to advanced airway management during out-of-hospital cardiac arrest (OHCA) is unknown. The traditional role of tracheal intubation has been challenged by the introduction of supraglottic airway devices (SGAs), but there is contradictory evidence from observational studies. We assessed the feasibility of a cluster-randomized trial to compare the i-gel SGA vs the laryngeal mask airway supreme (LMAS) vs current practice during OHCA. METHODS We conducted a cluster-randomized trial in a single ambulance service in England, with individual paramedics as the unit of randomization. Consenting paramedics were randomized to use either the i-gel or the LMAS or usual practice for all patients with non-traumatic adult OHCA that they attended over a 12-month period. The primary outcome was study feasibility, including paramedic and patient recruitment and protocol adherence. Secondary outcomes included survival to hospital discharge and 90 days. RESULTS Of the 535 paramedics approached, 184 consented and 171 attended study training. Each paramedic attended between 0 and 11 patients (median 3; interquartile range 2–5). We recruited 615 patients at a constant rate, although the LMAS arm was suspended in the final two months following three adverse incidents. The study protocol was adhered to in 80% of patients. Patient characteristics were similar in the three study arms, and there were no differences in secondary outcomes. CONCLUSION We have shown that a prospective trial of alternative airway management strategies in OHCA, cluster-randomized by paramedic, is feasible.
CLINICAL TRIAL REGISTRATION Registered on the International Standard Randomised Controlled Trial Registry (ISRCTN18528625).

Objective: To evaluate the effects on temperature and outcome at hospital discharge of a pre-hospital rapid infusion of large-volume, ice-cold intravenous Hartmann's solution in patients with out-of-hospital cardiac arrest and an initial cardiac rhythm of asystole or pulseless electrical activity. Design: Prospective, randomized, controlled clinical trial. Setting: Pre-hospital emergency medical service and 12 critical care units in Melbourne, Australia. Patients: One hundred and sixty-three patients who had been resuscitated from cardiac arrest with an initial cardiac rhythm of asystole or pulseless electrical activity. Interventions: Patients were randomized to either pre-hospital cooling using a rapid infusion of up to two litres of ice-cold Hartmann's solution (82 patients) or cooling after hospital admission (81 patients). The planned duration of therapeutic hypothermia (32°C–34°C) in both groups was 24 hrs. Measurements and Main Results: Patients allocated to pre-hospital cooling received a median of 1500 ml of ice-cold fluid. This resulted in a mean decrease in core temperature of 1.4°C compared with 0.2°C in hospital-cooled patients (p < .001). The time to therapeutic hypothermia (<34°C) was 3.2 hrs in the pre-hospital cooled group compared with 4.8 hrs in the hospital cooled group (p = .0328). Both groups received a mean of 15 hrs cooling in the hospital and only 7 patients in each group were cooled for 24 hrs. Overall, there was no difference in outcomes at hospital discharge, with favorable outcome (discharge from hospital to home or rehabilitation) in 10 of 82 (12%) of the pre-hospital cooled patients, compared with 7 of 81 (9%) of the hospital cooled patients (p = .50).
In the patients with a cardiac cause of the arrest, 8 of 47 patients (17%) who received pre-hospital cooling had a favorable outcome at hospital discharge compared with 3 of 43 (7%) in the hospital cooled group (p = .146). Conclusions: In adults who have been resuscitated from out-of-hospital cardiac arrest with an initial cardiac rhythm of asystole or pulseless electrical activity, pre-hospital cooling using a rapid infusion of large-volume, ice-cold intravenous Hartmann's solution decreases core temperature at hospital arrival and decreases the time to therapeutic hypothermia. In patients with a cardiac cause of the arrest, this treatment may increase the rate of favorable outcome at hospital discharge. Further larger studies should evaluate the effects of pre-hospital cooling when the initial cardiac rhythm is asystole or pulseless electrical activity, particularly in patients with a cardiac cause of the arrest.

Background Clinical trials are required to strengthen the evidence base for prehospital care. This questionnaire study aimed to explore paramedics' perceptions of prehospital research and barriers to conducting prehospital clinical trials. Methods A self-completed questionnaire was developed to explore paramedic perceptions and barriers to undertaking prehospital trials, based upon a review of existing research and semistructured qualitative interviews with five paramedics. The questionnaire was distributed by 'research champions' to 300 paramedics at randomly selected ambulance stations in Yorkshire. Results Responses were received from 96/300 participants (32%). Interest in clinical trials was reported, but barriers were recognised, including perceptions of poor knowledge and limited use of evidence, that conducting research is not a paramedic's responsibility, limited support for involvement in trials, concerns about the practicalities of randomisation and consent, and time pressures.
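The cardiac-cause subgroup figures above (8/47 vs 3/43) can be checked with simple arithmetic; the crude risk ratio below is not reported in the abstract and is computed here purely for illustration:

```python
# Check of the subgroup percentages reported for the pre-hospital cooling
# trial (cardiac-cause arrests). The risk ratio is NOT given in the abstract;
# it is derived here only to illustrate the size of the observed difference.
prehospital = 8 / 47   # favorable outcome, pre-hospital cooling arm
inhospital = 3 / 43    # favorable outcome, hospital cooling arm

print(f"pre-hospital: {prehospital:.0%}, in-hospital: {inhospital:.0%}")
print(f"crude risk ratio: {prehospital / inhospital:.2f}")
```

The roughly 2.4-fold difference in favorable-outcome rates, with p = .146, illustrates why the authors call for larger studies: the subgroup is too small for the comparison to reach significance.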
No association was found between training route and perceived understanding of trials (p=0.263) or feeling that involvement in trials was a professional responsibility (p=0.838). Previous involvement in prehospital research was not associated with opinions on the importance of an evidence base (p=0.934) or gaining consent (p=0.329). The number of years respondents had been practicing was not associated with opinions on personal experience versus scientific evidence (p=0.582) or willingness to receive training for clinical trials (p=0.111). However, the low response rate limited the power of the study to detect potential associations. Conclusions Paramedics reported interest in and understanding of research, but a number of practical and ethical barriers were recognised that need to be addressed if prehospital clinical trials are to increase.
11,092
27,820,553
The results indicate that the use of human patient simulation manikins improved knowledge acquisition and critical thinking, and enhanced students' satisfaction with learning. There is a lack of unequivocal evidence on the effectiveness of using high-fidelity human patient simulation manikins in the teaching of clinical reasoning skills to undergraduate nursing students. CONCLUSION Further research is required to ascertain the effectiveness of the use of human patient simulation manikins as an educational strategy to improve the clinical reasoning skills of undergraduate nursing students. The importance of this research is underscored by the potential for patient outcomes to be improved by enhancing the clinical reasoning skills of undergraduate nursing students and graduates. IMPLICATIONS FOR EDUCATION This review presents evidence to suggest that using HPSMs significantly improves learning outcomes related to clinical reasoning, namely critical thinking, clinical skill performance and knowledge acquisition. In addition, results indicate high student satisfaction with the HPSM simulation experience.
BACKGROUND Human patient simulation manikins are being used extensively both nationally (in Australia) and internationally in the education of health professionals. There is evidence suggesting that these types of technologies are effective in teaching psychomotor skills. Furthermore, student satisfaction with simulation approaches is generally high. However, the extent to which human patient simulation manikins are effective in the teaching of clinical reasoning skills to undergraduate nursing students is less clear. OBJECTIVE The aim of this systematic review was to identify the best available evidence for the effectiveness of using whole-body high-fidelity human patient simulation manikins to teach clinical reasoning skills to undergraduate nursing students. Other outcome measures included critical thinking, student satisfaction, knowledge acquisition, confidence levels, and skill performance as assessed by methods such as objective structured clinical examinations and questionnaires.
AIM The aim of this paper is to present the results of a study designed to determine the effect of scenario-based simulation training on nursing students' clinical skills and competence. BACKGROUND Using full-scale, realistic medical simulation for training healthcare professionals is becoming more and more common. Access to this technology is easier than ever before with the opening of several simulation centres throughout the world and the availability on the market of more sophisticated and affordable patient simulators. However, there is little scientific evidence proving that such technology is better than more traditional techniques in the education of, for example, undergraduate nursing students. METHODS A pretest/post-test design was employed with volunteer undergraduate students (n = 99) from a second-year Diploma of Higher Education in Nursing programme in the United Kingdom, using a 15-station Objective Structured Clinical Examination. Students were randomly allocated to either a control or an experimental group. The experimental group, as well as following their normal curriculum, were exposed to simulation training. Subsequently, all students were re-tested and completed a questionnaire. The data were collected between 2001 and 2003. RESULTS The control and experimental groups improved their performance on the second Objective Structured Clinical Examination. Mean test scores increased by 7.18 and 14.18 percentage points, respectively. The difference between the means was statistically significant (P < 0.001). However, students' perceptions of stress and confidence, measured on a 5-point Likert scale, were very similar between groups at 2.9 (1, not stressful; 5, very stressful) and 3.5 (1, very confident; 5, not confident) for the control group, and 3.0 and 3.4 for the experimental group. CONCLUSIONS Intermediate-fidelity simulation is a useful training technique.
It enables small groups of students to practise in a safe and controlled environment how to react adequately in a critical patient care situation. This type of training is very valuable to equip students with a minimum of technical and non-technical skills before they use them in the practice setting.

Nowadays simulation is taking an important place in the training and education of healthcare professionals. The University of Hertfordshire is carrying out a study which aims to determine the effect of realistic scenario-based simulation on nursing students' competence and confidence. This project is sponsored by the British Heart Foundation and takes place in the Hertfordshire Intensive Care and Emergency Simulation Centre (HICESC), a simulated three-adult-bed Intensive Care Unit. The simulation platform used is a Laerdal SimMan Universal Patient Simulator. A unique and robust study design, and results of the study, are presented in this article. Consecutive cohorts of students are being assessed and reassessed after six months using an Objective Structured Clinical Examination (OSCE). Students are randomly divided into a control and an experimental group for the period intervening between the two examinations. The experimental group is exposed to simulation training while the other students follow their usual nursing courses. Comparison is made between the OSCE results of the two groups of students. The experimental group had a greater improvement in performance than the control group (13.43% compared with 6.76% (p<0.05)). The results and feedback received from students and lecturers suggest that simulation training in nursing education is beneficial.

The project purpose was to determine whether measures of critical thinking show differences between three groups (simulator, non-simulator, control) of baccalaureate nursing students. The second purpose was to determine the moderating effect of students' preferred learning style.
All groups experienced a moderate to large effect size in critical thinking scores. The corrected model for the total scale gain score was statistically significant, but not significant for learning style or group.

Background Despite the recent wave of interest being shown in high-fidelity simulators, they do not represent a new concept in healthcare education. Simulators have been a part of clinical education since the 1950s. The growth of patient simulation as a core educational tool has been driven by a number of factors. Declining inpatient populations, concerns for patient safety and advances in learning theory are forcing healthcare educators to look for alternatives to the traditional clinical encounter for skill acquisition for students. Objective The aim of this review was to identify the best available evidence on the effectiveness of using simulated learning experiences in pre-licensure health profession education. Inclusion criteria Types of studies: This review considered any experimental or quasi-experimental studies that addressed the effectiveness of using simulated learning experiences in pre-licensure health profession practice. In the absence of randomised controlled trials, other research designs were considered for inclusion, such as, but not limited to, non-randomised controlled trials and before-and-after studies. Types of participants: This review included participants who were pre-licensure practitioners in nursing, medicine, and rehabilitation therapy. Types of intervention(s)/phenomena of interest: Studies that evaluated the use of human physical anatomical models with or without computer support, including whole-body or part-body simulators, were included. Types of outcome measures: Student outcomes included knowledge acquisition, skill performance, learner satisfaction, critical thinking, self-confidence and role identity.
Search strategy Using a defined search and retrieval method, the following databases were accessed for the period 1995–2006: Medline, CINAHL, Embase, PsycINFO, HealthSTAR, Cochrane Database of Systematic Reviews and ERIC. Methodological quality Each paper was assessed by two independent reviewers for methodological quality prior to inclusion in the review using the standardised critical appraisal instruments for evidence of effectiveness developed by the Joanna Briggs Institute. Disagreements were dealt with by consultation with a third reviewer. Data collection Information was extracted from each paper independently by two reviewers using the standardised data extraction tool from the Joanna Briggs Institute. Disagreements were dealt with by consultation with a third reviewer. Data synthesis Due to the types of designs and quality of available studies, it was not possible to pool quantitative research study results in statistical meta-analysis. As statistical pooling was not possible, the findings are presented in descriptive narrative form. Results Twenty-three studies were selected for inclusion in this review, including partial task trainers and high-fidelity human patient simulators. The results indicate that there is high learner satisfaction with using simulators to learn clinical skills. The studies demonstrated that human patient simulators, which are used for teaching higher-level skills such as airway management and physiological concepts, are useful. While there are short-term gains in knowledge and skill performance, it is evident that performance of skills declines over time after initial training. Conclusion At best, simulation can be used as an adjunct for clinical practice, not a replacement for everyday practice. Students enjoyed the sessions and using the models purportedly makes learning easier.
However, it remains unclear whether the skills learned through a simulation experience transfer into real-world settings. More research is needed to evaluate whether the skills acquired with this teaching methodology transfer to the practice setting, such as the impact of simulation training on team function.

The purpose of this study was to compare the effectiveness of an interactive, multimedia CD-ROM with traditional methods of teaching the skill of performing a 12-lead ECG. A randomized pre/posttest experimental design was used. Seventy-seven baccalaureate nursing students in a required, senior-level critical-care course at a large midwestern university were recruited for the study. Two teaching methods were compared. The traditional method included a self-study module, a brief lecture and demonstration by an instructor, and hands-on experience using a plastic manikin and a real 12-lead ECG machine in the learning laboratory. The second method covered the same content using an interactive, multimedia CD-ROM embedded with virtual reality and supplemented with a self-study module. There were no significant (p < .05) baseline differences in pretest scores between the two groups and no significant differences by group in cognitive gains, student satisfaction with their learning method, or perception of self-efficacy in performing the skill. Overall results indicated that both groups were satisfied with their instructional method and were similar in their ability to demonstrate the skill correctly on a live, simulated patient. This evaluation study is a beginning step to assess new and potentially more cost-effective teaching methods and their effects on student learning outcomes and behaviors, including the transfer of skill acquisition via a computer simulation to a real patient.

As computer-assisted instruction (CAI) use has increased during the past few years, nurse educators have expressed concern regarding its effectiveness.
The purpose of this quasi-experimental study was to determine if completion of a computerized simulation about a surgical patient increased baccalaureate nursing students' self-efficacy about caring for surgical patients in the clinical environment. Such an association is desirable because increased levels of self-efficacy have been associated with increased motivation, goal-setting, and achievement. A nonprobability, convenience sample (N = 23) of second-year baccalaureate nursing students was assigned randomly to experimental and control groups. The self-efficacy of the experimental group was measured three times: on an initial pretest, following the intervention of a computer simulation, and after an 8-week clinical rotation. The self-efficacy of the control group was measured on an initial pretest and after the 8-week rotation. Higher preclinical self-efficacy scores (p<.01) of the experimental group support the use of CAI as an important aspect of clinical education. Implications for nursing education, practice, and research are addressed. Computer-assisted instruction (CAI) has been available for several decades; however, its use in nursing education has escalated during the past few years. This intensified interest is the result of a decrease in the cost of computer hardware and an increase in the availability of relevant educational software (Wright, 1995). Cutbacks to educational funding, along with the rising costs of clinical teaching, have motivated studies of this method of nursing education. The purpose of this pilot study was to determine if completion of a supplemental computerized simulation about a surgical patient increased baccalaureate nursing students' self-efficacy about caring for surgical patients in the clinical area.
The exploration of a link between CAI and self-efficacy focused on the processes of learning, thereby extending the literature.

The actual effect of the use of simulations on clinical decision making is inconclusive. This pilot study used a posttest design to determine the effect of a simulation strategy on the clinical decision-making process of midwifery students. Thirty-six graduate diploma students volunteered and were randomly assigned to two groups, with the experimental group receiving two simulation sessions (normal labor and physiological jaundice), and the control group receiving the two usual lectures. The main findings were that students who received the simulation strategy collected more clinical information, revisited collected clinical information less, made fewer formative inferences, reported higher confidence levels, and, for the posttest normal labor simulation, reached a final decision more quickly. Such effects are reasonable for this type of intervention with the existent variability in each group. Further research with a larger sample size and more rigorous data collection strategies is required.

Randomised controlled trials are the best way to compare the effectiveness of different interventions. Only randomised trials allow valid inferences of cause and effect. Only randomised trials have the potential directly to affect patient care, occasionally as single trials but more often as the body of evidence from several trials, whether or not combined formally by meta-analysis. It is thus entirely reasonable to require higher standards for papers reporting randomised trials than those describing other types of study. Like all studies, randomised trials are open to bias if done badly.1 It is thus essential that randomised trials are done well and reported adequately. Readers should not have to infer what was probably done; they should be told explicitly.
Proper methodology should be used and be seen to have been used. Yet reviews of published trials have consistently found major deficiencies in reporting,2 3 4 making the task
11,093
26,757,311
RESULTS Neurological, metabolic, and sedation-related cognitive effects were reported more systematically than affective, anticholinergic, autonomic, cutaneous, hormonal, miscellaneous, and nonsedative cognitive effects. The global impact of AEs on patient well-being was poorly assessed.
OBJECTIVE Adverse effects (AEs) of antipsychotic medication have important implications for patients and prescribers in terms of well-being, treatment adherence, and quality of life. This review summarizes strategies for collecting and reporting AE data across a representative literature sample to ascertain their rigor and comprehensiveness.
The prevalence of sexual dysfunction in schizophrenia patients was investigated as part of this large (n = 7655), prospective, international (27 countries) study. Based on patient reports, sexual dysfunction affected approx. 50% of patients, and the prevalence of complaints varied significantly between regions (p < 0.0001). The prevalence of sexual dysfunction, as perceived by psychiatrists, also varied significantly across regions (p < 0.0001). Psychiatrists significantly underestimated the presence of impotence/sexual dysfunction (p < 0.0001) and loss of libido (p < 0.0001), compared to reports from patients. The frequency of sexual dysfunction was significantly higher in patients who had been using prolactin-elevating antipsychotics prior to study entry, compared to those who had been treated with prolactin-sparing antipsychotics (patient reports, p = 0.002; psychiatrist perception, p = 0.0004). This study has shown that the prevalence of sexual dysfunction is high in both male and female patients with schizophrenia and frequently underestimated by psychiatrists. Regional variation is evident in both psychiatrist perceptions and patient reports of sexual dysfunction. Given the importance of sexual function to quality of life and treatment compliance, proactive assessment of sexual function is required to optimize schizophrenia management.

This study compares two methods for elicitation of treatment-emergent side effects. One is the open-ended general inquiry and the other is a specific inquiry that asks about a wide range of events thought to be treatment-related. The study goal was to determine the extent to which the specific inquiry method elicits clinically useful information over and above that elicited by the general inquiry method. The assessment instrument we used is SAFTEE, a structured interview schedule developed by the National Institute of Mental Health.
We looked for differences between general and specific inquiry formats in terms of number of events elicited, type of event, severity, functional impairment, and clinician action taken. We found that both methods contributed to elicitation of events that, in the clinician's opinion, required some change in management. However, events reported on the General Inquiry form were significantly more distressing, more often interfered with daily functioning, and elicited more extensive changes in clinical management. No medically serious events were elicited on the specific inquiry form alone. Based on these findings, and in view of the amount of time and effort required to administer and score it, we do not recommend the specific inquiry form of SAFTEE as a standard assessment tool for routine use in all clinical trials. We do consider it to be a useful method for comprehensive elicitation about treatment-emergent effects in targeted and specific research contexts. We see the schedule as a comprehensive document or library of queries to be tailored to the needs of individual protocols.

CONTEXT Randomized trials with adequate sample size offer an opportunity to assess the safety of new medications in a controlled setting; however, generalizable data on drug safety reporting are sparse. OBJECTIVE To scrutinize the completeness of safety reporting in randomized trials. DESIGN, SETTING, AND PATIENTS Survey of safety reporting in 192 randomized drug trials on 7 diverse topics with sample sizes of at least 100 patients and at least 50 patients in a study arm (N = 130,074 patients). Trial reports were identified from comprehensive meta-analyses in 7 medical areas. MAIN OUTCOME MEASURES Adequate reporting of specific adverse effects and frequency and reasons for withdrawals due to toxic effects; article space allocated to safety reporting and predictors of such reporting.
RESULTS Severity of clinical adverse effects and laboratory-determined toxicity was adequately defined in only 39% and 29% of trial reports, respectively. Only 46% of trials stated the frequency of specific reasons for discontinuation of study treatment due to toxicity. For these 3 parameters, there was significant heterogeneity in rates of adequate reporting across topics (P = .003, P < .001, and P = .02, respectively). Overall, the median space allocated to safety results was 0.3 page. A similar amount of space was devoted to contributor names and affiliations (P = .16). On average, the percentage of space devoted to safety in the results section was 9.3% larger in trials involving dose comparisons than in those that did not (P < .001) and 3.8% smaller in trials reporting statistically significant results for efficacy outcomes (P = .047). CONCLUSIONS The quality and quantity of safety reporting vary across medical areas, study designs, and settings but they are largely inadequate. Current standards for safety reporting in randomized trials should be revised to address this inadequacy. Background We would expect information on adverse drug reactions in randomised clinical trials to be easily retrievable from specific searches of electronic databases. However, complete retrieval of such information may not be straightforward, for two reasons. First, not all clinical drug trials provide data on the frequency of adverse effects. Secondly, not all electronic records of trials include terms in the abstract or indexing fields that enable us to select those with adverse effects data. We have determined how often automated search methods, using indexing terms and/or textwords in the title or abstract, would fail to retrieve trials with adverse effects data.
Methods We used a sample set of 107 trials known to report frequencies of adverse drug effects, and measured the proportion that (i) were not assigned the appropriate adverse effects indexing terms in the electronic databases, and (ii) did not contain identifiable adverse effects textwords in the title or abstract. Results Of the 81 trials with records on both MEDLINE and EMBASE, 25 were not indexed for adverse effects in either database. Twenty-six trials were indexed in one database but not the other. Only 66 of the 107 trials reporting adverse effects data mentioned this in the abstract or title of the paper. Simultaneous use of textword and indexing terms retrieved only 82/107 (77%) papers. Conclusions Specific search strategies based on adverse effects textwords and indexing terms will fail to identify nearly a quarter of trials that report on the rate of drug adverse effects. We sought to identify differences in the description of adverse drug experiences in reports of randomized clinical trials (RCTs) from the United States and Japan, using diclofenac and simvastatin as test drugs. Reports were identified in Medline (Index Medicus 1966-1990), EMBASE (Excerpta Medica 1974-1990), JAPICDOC (1979-1990), and JOIS-III (JMEDICINE 1980-1990). In each search, keywords describing study design were paired with the drugs' generic names, chemical names, and development numbers. Twenty-seven U.S. reports (18 for diclofenac and 9 for simvastatin) and 22 Japanese reports (17 for diclofenac and 5 for simvastatin) identified in these four databases were selected for review. For each paper we identified the relation of the article to the data (preliminary, primary, and secondary reports, reviews), the means of identifying adverse reactions, the principal outcomes of the trials, and a variety of descriptive measures relating to study design, authorship, and elements of presentation.
With few exceptions, Japanese reports were not indexed in English-language databases, and studies from the United States were not covered in the Japanese databases. The Japanese literature consisted exclusively of primary reports of clinical trials, whereas the U.S. literature was dominated by review articles and secondary reports of data from trials not fully published elsewhere. Japanese reports contained more detail on adverse experiences but reported principally those attributed to the drugs by attending clinicians. U.S. reports by contrast offered little detail but tended to include all adverse experiences, whether or not clinically attributed to drugs. A preponderance of U.S. articles reported significant differences between drugs in safety or treatment efficacy, whereas only one third of the Japanese articles did so for the same agents. Reports from both countries offered few details of the methods used to gather information on adverse drug experiences, and as a result the reported absolute frequencies of such events are difficult to compare between trials or to generalize to other settings. In conclusion, the reporting of adverse reactions in clinical trials is inadequate in both the United States and Japanese literature. The shortcomings are complementary in that reports of U.S. trials contain insufficient detail and Japanese reports do not interpret or synthesize experience. Clinical research into drug safety in both countries could be improved through the adoption of simple standards of clarity and consistency in the monitoring and reporting of drug adverse effects. BACKGROUND Reports of clinical trials usually emphasize efficacy results, especially when results are statistically significant. Poor safety reporting can lead to misinterpretation and inadequate conclusions about the interventions assessed. Our aim was to describe the reporting of harm-related results from randomized controlled trials (RCTs).
METHODS We searched the MEDLINE database for reports of RCTs published from January 1, 2006, through January 1, 2007, in 6 general medical journals with a high impact factor. Data were extracted by use of a standardized form to appraise the presentation of safety results in text and tables. RESULTS Adverse events were mentioned in 88.7% of the 133 reports. No information on severe adverse events and withdrawal of patients owing to an adverse event was given in 27.1% and 47.4% of articles, respectively. Restrictions in the reporting of harm-related data were noted in 43 articles (32.3%), with a description of the most common adverse events only (n = 17), severe adverse events only (n = 16), statistically significant events only (n = 5), and a combination of restrictions (n = 5). The population considered for safety analysis was clearly reported in 65.6% of articles. CONCLUSION Our review reveals important heterogeneity and variability in the reporting of harm-related results in publications of RCTs. Objective: This randomized double-blind multicenter trial evaluated the effects of olanzapine vs. clozapine on subjective well-being, quality of life (QOL) and clinical outcome. Background: Despite much being written on the topic, there are few surveys investigating the prevalence of anticholinergic adverse effects of antipsychotic drugs. One study, however, used trial-derived data to calculate estimates. Objectives: To investigate the prevalence/incidence rates of anticholinergic effects as viewed from within relevant randomized trials. Methods: Data were extracted from each relevant study included in Cochrane reviews. Data were checked, extracted, and simple frequencies and 95% confidence intervals (CIs) were calculated. Results: Many trials in relevant reviews reported no data on anticholinergic effects (estimate 40,000 participants).
However, data were extracted from 177 studies within 54 reviews (N = 27,328 participants). Most data are short-term (< 12 weeks). For blurred vision, the newer generations of drugs have rates of between 10% and 20% (eg, risperidone, n = 1460, 6 randomized controlled trials [RCTs], 11.9% prevalence; CI, 10-14; olanzapine, n = 1584; 4 RCTs, 12.2% prevalence; CI, 11-14). These estimates are similar to those of sulpiride (n = 186; 2 RCTs, 12.4%; CI, 8-18) and chlorpromazine (n = 294; 10 RCTs, 11.2%; CI, 8-15), less than trifluoperazine (n = 167; 8 RCTs, 31.1%; CI, 25-39), but considerably more than perphenazine (n = 410; 8 RCTs, 3.7%; CI, 2-6). Data are presented on a range of anticholinergic effects across different periods. Conclusions: Anticholinergic symptoms are common adverse effects associated with the use of all antipsychotic drugs, and newer-generation drugs are not clearly distinguishable from many older compounds. Adverse effect data should be more accessible. Recent research indicates that subjective well-being is a major determinant of medication compliance in schizophrenia. However, it is yet unresolved whether atypical neuroleptics differ regarding subjective side-effects. A self-report instrument has been constructed to evaluate 'subjective well-being under neuroleptics' (SWN). The primary aims of the present study were to develop a short form of the SWN and to investigate the extent to which the atypical antipsychotic improves the patient's subjective well-being. The short form of the SWN was constructed following an item analysis based on data from 212 schizophrenic patients medicated with either typical or atypical antipsychotics. The short form of the SWN showed sufficient internal consistency and good construct validity.
The SWN was only moderately correlated with positive and negative syndrome scale (PANSS) scores or changes in psychopathology (r = -0.20 to -0.37). SWN ratings in patients receiving olanzapine were superior compared to those of patients medicated with either clozapine or risperidone on three of five domains of well-being. Clozapine reduced global psychiatric symptoms significantly more than risperidone. It is concluded that the assessment of subjective well-being under antipsychotic treatment provides an independent outcome measure which is relevant to compliance. This study examined the relationship of the therapeutic alliance to the treatment course and outcome of 143 patients with nonchronic schizophrenia. Results showed that patients who formed good alliances with their therapists within the first 6 months of treatment were significantly more likely to remain in psychotherapy, comply with their prescribed medication regimens, and achieve better outcomes after 2 years, with less medication, than patients who did not. These results underscored the prognostic value of assessing the alliance and the need to identify factors that contribute to its development and maintenance with schizophrenic patients. A modification of an earlier rating scale for extrapyramidal system disturbance is described, and evidence for the validity and reliability of the scale is presented. The usefulness of the scale in studies of neuroleptic drugs is discussed. By its application it is possible to quantify extrapyramidal side effects and to separate them into four principal factors.
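The anticholinergic-effects record above quotes prevalence figures with 95% confidence intervals (e.g. risperidone blurred vision: 11.9% of n = 1460; CI, 10-14). A minimal sketch of how such intervals can be reproduced with the normal approximation, assuming the event count of 174 is back-calculated from the reported 11.9% rather than taken from the source:

```python
from math import sqrt

def prevalence_ci(events: int, n: int, z: float = 1.96):
    """Point estimate and normal-approximation 95% CI for a prevalence proportion."""
    p = events / n
    se = sqrt(p * (1 - p) / n)  # standard error of a binomial proportion
    return p, p - z * se, p + z * se

# Hypothetical check against the risperidone figure: ~174 events out of 1460.
p, lo, hi = prevalence_ci(174, 1460)
print(f"{p:.1%} (95% CI {lo:.0%}-{hi:.0%})")  # about 11.9% (95% CI 10%-14%)
```

The rounded bounds match the "CI, 10-14" reported in the record; an exact (Clopper-Pearson) or Wilson interval would differ slightly at these sample sizes.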
11,094
29,956,028
Conclusions The RAC was not found to be more effective or safer than LC for benign gallbladder diseases, which indicates that RAC is a developing procedure rather than an immediate replacement for LC. Given the higher costs, the current evidence is in favor of LC in cholecystectomy.
Background Robotic surgery, an emerging technology, has some potential advantages over laparoscopic surgery in many complicated endoscopic procedures. But robot-assisted cholecystectomy (RAC) remains controversial with respect to its comparative merit over conventional laparoscopic cholecystectomy (LC). The aim of this study was to evaluate the safety and efficacy of RAC compared with LC for benign gallbladder disease.
BACKGROUND Single-incision surgery has gained in popularity, and the recent development of specialized robotic and laparoscopic instruments may remove some of the ergonomic and technical difficulties associated with this approach. However, questions of cost and efficiency remain. METHODS We prospectively collected perioperative outcome and efficiency (operative time, case volume) data for our single-site robotic cholecystectomy cases and retrospectively reviewed data for our single-incision laparoscopic cholecystectomy cases. RESULTS There were no differences in patient characteristics or perioperative outcomes between the robotic (n = 20) and laparoscopic (n = 10) groups; operative times were equivalent (84.6 vs 85.5 min; p = 0.8737) and blood loss and complications were minimal. There was a higher robotic case volume, with an average of two robotic cases (range 1-4)/day vs one/day for laparoscopic cases (range 1-1; p = 0.0306). Streamlined instrument costs were essentially equivalent. CONCLUSIONS Robotic single-site cholecystectomy is a safe, cost-effective alternative to single-incision laparoscopic cholecystectomy in a robot-existing model. Background and aims Laparoscopic surgery has become the treatment of choice for cholecystectomy. Many studies showed that while this approach benefits the patient, the surgeon faces such distinct disadvantages as a poor ergonomic situation and limited degrees of freedom with limited motion as a consequence. Robots have the potential to overcome these problems. To evaluate the efficiency and feasibility of robotically assisted surgery (RAC), we designed a prospective study to compare it with standard laparoscopic cholecystectomy (SLC). Materials and methods Between 2001 and 2003, 26 patients underwent SLC and 20 patients underwent RAC using the ZEUS system. The feasibility, safety, and possible advantages were evaluated.
To assess the efficacy, the total time in the operating room was divided into preoperative, operative, and postoperative time frames. Results For RAC in comparison with SLC, the preoperative phase including equipment setup was significantly longer. In the intraoperative phase, the cut-closure time and camera and trocar insertion times were significantly longer. It is interesting to note that the net dissection time for the cystic artery, duct, and the gall bladder was not different from SLC. Conclusions The study demonstrates the feasibility of robotically assisted cholecystectomy without system-specific morbidity. There is time loss in several phases of robotic surgery due to equipment setup and deinstallation, and therefore no benefit in using the robot in laparoscopic cholecystectomy. Abstract Background Single-incision laparoscopic cholecystectomy evolved from the traditional multiport laparoscopic technique. Prior trials have demonstrated improved cosmesis with the single-incision technique. Robotic single-site surgery minimizes the technical difficulties associated with the laparoscopic single-incision approach. This is the first prospective, randomized, controlled study comparing robotic single-site cholecystectomy (RSSC) and multiport laparoscopic cholecystectomy (MPLC) in terms of cosmesis and patient satisfaction. Methods Patients with symptomatic benign gallbladder disease were randomized to RSSC or MPLC. Data included perioperative variables such as operative time, conversion and complications, and cosmesis satisfaction, body image perception, and quality of life using validated questionnaires at postoperative visits of 2 and 6 weeks and 3 months. Results One hundred thirty-six patients were randomized to RSSC (N = 83) and MPLC (N = 53) at 8 institutions. Both cohorts were dominated by higher enrollment of females (RSSC = 78%, MPLC = 92%). The RSSC and MPLC cohorts were otherwise statistically matched.
Operative time was longer for RSSC (61 min vs. 44 min, P < 0.0001). There were no differences in complication rates. RSSC demonstrated a significant superiority in cosmesis satisfaction and body image perception (P value < 0.05 at every follow-up). There was no statistically significant difference in patient-reported quality of life. Multivariate analysis of female patients demonstrated significantly higher preference for RSSC over MPLC in cosmesis satisfaction and body image perception, with no difference seen in overall quality of life. Conclusions Results from this trial show that RSSC is associated with improved cosmesis satisfaction and body image perception without a difference in observed complication rate. The uncompromised safety and the improved cosmesis satisfaction and body image perception provided by RSSC for female patients support consideration of the robotic single-site approach. ClinicalTrials.gov identifier NCT01932216. Background The aim of this study was to examine the advantages and risks of the Automated Endoscopic System for Optical Positioning (AESOP) 3000 robot system during uncomplicated laparoscopic cholecystectomies or laparoscopic hernioplasty. Methods In a randomized study, we examined two groups of 120 patients each, with the diagnosis of cholecystolithiasis or unilateral inguinal hernia, respectively. We worked with the AESOP 3000, a robotic arm system that is voice-controlled by the surgeon. The subjective and objective comfort of the surgeon as well as the course and length of the operation were measured. Results The robot-assisted operations required significantly longer preparation and operation times. With regard to the necessary commands and manual camera corrections, the assistant group was favored. The same was true for the subjective evaluation of the surgical course by the surgeon.
Conclusions Our study showed that the use of AESOP during laparoscopic cholecystectomy and hernioplasty is possible in 94% of all cases. The surgeon must accept a definite loss of comfort as well as a certain loss of time against the advantage of saving on personnel. Abstract Background Randomized studies could not demonstrate significant outcome benefit after single-incision laparoscopic cholecystectomy compared to classic four-port laparoscopic cholecystectomy (CLC). The new robotic single-site platform might offer potential benefits on local inflammation and postoperative pain due to its technological advantages. This prospective randomized double-blind trial compared the short-term outcomes between single-incision robotic cholecystectomy (SIRC) and CLC. Methods Two groups of 30 eligible patients were randomized for SIRC or CLC. During the first postoperative week, patients and study monitors were blinded to the type of procedure performed by four dressing tapes applied on the abdomen. Pain was assessed at 6 h and on days 1, 7 and 30 after surgery, along with a 1-10 cosmetic score. Results No significant difference in postoperative pain occurred in the two groups at any time point nor for any of the abdominal sites. Nineteen (63%) SIRC patients reported early postoperative pain in extra-umbilical sites. Intraoperative complications which might influence postoperative pain, such as minor bleeding and bile spillage, were similar in both groups and no conversions occurred. The cosmetic score 1 month postoperatively was higher for SIRC (p < 0.001). Two SIRC patients had wound infection, one of which developed an incisional hernia. Conclusions SIRC does not offer any significant reduction of postoperative pain compared to CLC. SIRC patients unaware of their type of operation still report pain in extra-umbilical sites, as after CLC.
The cosmetic advantage of SIRC should be balanced against an increased risk of incisional hernias and higher costs. Trial registration number ACTRN12614000119695 (http://www.anzctr.org.au). Flaws in the design, conduct, analysis, and reporting of randomised trials can cause the effect of an intervention to be underestimated or overestimated. The Cochrane Collaboration's tool for assessing risk of bias aims to make the process clearer and more accurate. Background: Laparoscopic surgery might be beneficial for the patient, but it imposes increased physical and mental strain on the surgeon. Robot-assisted laparoscopic surgery addresses some of the laparoscopic drawbacks and may potentially reduce mental strain. This could reduce the risk of surgeon's fatigue, mishaps and strain-induced illnesses, which may eventually improve the safety of laparoscopic surgical procedures. Methods: To test this hypothesis, a randomized study was performed, comparing both heart rate and heart rate variability (HRV) of the surgeon as a measure of total and mental strain, respectively, during conventional and robot-assisted laparoscopic cholecystectomy. Results: Both heart rate and HRV (the low-frequency band/high-frequency band ratio) were significantly decreased when using robotic assistance. Conclusions: These data suggest the use of the daVinci® Surgical System leads to less physical and mental strain of the surgeon during surgery. However, assessing mental strain by means of HRV is cumbersome since there is no clear cutoff point or scale for maximum tolerated strain levels and its related effects on surgeon's health. Background Recent advances in robotic technology suggest that the utilization of the da Vinci Single-Site™ platform for cholecystectomy is safe, feasible and results in a shorter learning curve compared to conventional single-incision laparoscopic cholecystectomy.
Moreover, the robot-assisted technology has been shown to reduce the surgeon's stress load compared to standard single-incision laparoscopy in an experimental setup, suggesting an important advantage of the da Vinci platform. However, the above-mentioned observations are based solely on case series, case reports and experimental data, as high-quality clinical trials to demonstrate the benefits of the da Vinci Single-Site™ cholecystectomy have not been performed to date. Methods This study addresses the question whether robot-assisted Single-Site™ cholecystectomy provides significant benefits over single-incision laparoscopic cholecystectomy in terms of surgeon's stress load, while matching the standards of the conventional single-incision approach with regard to peri- and postoperative outcomes. It is designed as a single centre, single-blinded randomized controlled trial, which compares both surgical approaches with the primary endpoint of surgeon's physical and mental stress load at the time of surgery. In addition, the study aims to assess secondary endpoints such as operating time, conversion rates, additional trocar placement, intra-operative blood loss, length of hospital stay, costs of procedure, health-related quality of life, cosmesis and complications. Patients as well as ward staff are blinded until the 1st postoperative year. Sample size calculation based on the results of a previously published experimental setup, utilizing an estimated effect size of surgeon's comfort of 0.8 (power of 0.8, alpha-error level of 0.05, error margin of 10-15%), resulted in a number of 30 randomized patients per arm. Discussion The study is the first randomized controlled trial that compares the da Vinci Single-Site™ platform to conventional laparoscopic approaches in cholecystectomy, one of the most frequently performed operations in general surgery.
Trial registration This trial is registered at clinicaltrials.gov (trial number: NCT02485392). Registered February 19, 2015. Objective: To compare safety and costs of robotic-assisted and laparoscopic cholecystectomy in patients with symptomatic cholecystolithiasis. Background: Technical benefits of robotic-assisted surgery are well documented. However, pressure is currently applied to decrease costs, leading to restriction of development and implementation of new technologies. So far, no convincing data are available comparing outcome or costs between computer-assisted and conventional laparoscopic cholecystectomy. Methods: A prospective case-matched study was conducted on 50 consecutive patients who underwent robotic-assisted cholecystectomy (Da Vinci Robot, Intuitive Surgical) between December 2004 and February 2006. These patients were matched 1:1 to 50 patients with conventional laparoscopic cholecystectomy, according to age, gender, American Society of Anesthesiologists score, histology, and surgical experience. Endpoints were complications after surgery (mean follow-up of 12.3 months [SD 1.2]), conversion rates, operative time, and hospital costs (ClinicalTrials.gov ID: NCT00562900). Results: No minor, but 1 major complication occurred in each group (2%). No conversion to open surgery was needed in either group. Operation time (skin-to-skin, 55 minutes vs. 50 minutes, P < 0.85) and hospital stay (2.6 days vs. 2.8 days) were similar. Overall hospital costs were significantly higher for robotic-assisted cholecystectomy: $7985.4 (SD 1760.9) versus $6255.3 (SD 1956.4), P < 0.001, with a raw difference of $1730.1 (95% CI 991.4-2468.7) and a difference adjusted for confounders of $1606.4 (95% CI 1076.7-2136.2). This difference was mainly related to the amortization and consumables of the robotic system.
Conclusions: Robotic-assisted cholecystectomy is safe and, therefore, a valuable approach. Costs of robots, however, are high and do not justify the use of this technology considering the lack of benefits for patients. A reduction of acquisition and maintenance costs for the robotic system is a prerequisite for large-scale adoption and implementation. Abstract Background Surgeons continually strive to improve technology and patient care. One remarkable demonstration of this is the development of laparoscopic surgery. Once this proved to be a safe and reliable surgical approach, robotics seemed a logical progression of surgical technology. The aim of this project was to evaluate the utility of robotics in the context of single-incision laparoscopic cholecystectomy (SILC). Methods A retrospective review of a prospectively maintained database of robotic single-incision laparoscopic cholecystectomy (RSILC) and traditional SILC performed by a single surgeon at our institution from July 2010 to August 2013 was queried. All consecutive patients undergoing RSILC and SILC during this time period were included. Primary outcomes include conversion rate and operative time. Secondary outcomes include length of stay, duration of narcotic use, time to independent performance of daily activities, and cost. Categorical variables were evaluated using Chi-square analysis and continuous variables using t test or Wilcoxon's rank test. Results Thirty-eight patients underwent RSILC and 44 underwent SILC. BMI was higher in the RSILC group, and the number of patients with prior abdominal surgeries was higher in the SILC group. Otherwise, demographics were similar between the two groups. There was no difference in conversion rate between RSILC and SILC (8 vs 11%, p = 0.60). Mean operative time for RSILC was significantly greater compared with SILC (98 vs 68 min, p < 0.0001).
RSILC was associated with a longer duration of narcotic use (2.3 vs 1.7 days, p = 0.0019) and time to independent performance of daily activities (4 vs 2.3 days, p < 0.0001). Total cost is greater in RSILC ($8961 vs $5379, p < 0.0001). Conclusion While RSILC can be safely performed, it is associated with longer operative times and greater cost. BACKGROUND Minimally invasive techniques have become an integral part of general surgery, with recent investigation into single-incision laparoscopic cholecystectomy (SILC). This study presents the final 1-year results of a prospective, randomized, multicenter, single-blinded trial of SILC vs multiport cholecystectomy (4PLC). STUDY DESIGN Patients with biliary colic and documented gallstones or polyps or with biliary dyskinesia were randomized to SILC vs 4PLC. Data measures included operative details, adverse events, and conversion to 4PLC or laparotomy. Patients were followed for 12 months. RESULTS Two hundred patients underwent randomization to SILC (n = 119) or 4PLC (n = 81). Enrollment ranged from 1 to 50 patients, with 4 sites enrolling > 25 patients. Total adverse events were not significantly different between groups (36% 4PLC vs 45% SILC; p = 0.24), as were severe adverse events (4% 4PLC vs 10% SILC; p = 0.11). Incision-related adverse events were higher after SILC (11.7% vs 4.9%; p = 0.13), but all of these were listed as mild or moderate. Total hernia rates were 1.2% (1 of 81) in 4PLC patients vs 8.4% (10 of 119) in SILC patients (p = 0.03). At 1-year follow-up, cosmesis scores continued to favor SILC (p < 0.0001). CONCLUSIONS Results of this trial show SILC to be a safe and feasible procedure when compared with 4PLC, with similar total adverse events but with an identified significant increase in hernia formation.
Cosmesis scoring and patient preference at 12 months continue to favor SILC, and more than half of the patients were willing to pay more for a single-site surgery over a standard laparoscopic procedure. Additional longer-term population-based studies are needed to clarify if this increased rate of hernia formation as compared with 4PLC will continue to hold true. Background The aim of this study was to compare the outcomes of single-site robotic cholecystectomy with multi-port laparoscopic cholecystectomy within a high-volume tertiary health care center. Methods A retrospective analysis of prospectively maintained data was conducted on patients undergoing single-site robotic cholecystectomy or multi-port laparoscopic cholecystectomy between October 2011 and July 2014. A single surgeon performed all the surgeries included in the study. Results A total of 678 cholecystectomies were performed. Of these, 415 (61%) were single-site robotic cholecystectomies and 263 (39%) were multi-port laparoscopic cholecystectomies. Laparoscopic patients had a greater mean BMI (30.5 vs. 29.0 kg/m2; p = 0.008), were more likely to have undergone prior abdominal surgery (83.3 vs. 41.4%; p < 0.001) and had a higher incidence of preexisting comorbidities (76.1 vs. 67.2%; p = 0.014) as compared to the robotic group. There was no statistical difference in the total operative time, rate of conversion to open procedure and mean length of follow-up between the two groups. The mean length of hospital stay was shorter for patients within the robotic group (1.9 vs. 2.4 days; p = 0.012). Single-site robotic cholecystectomy was associated with a higher rate of wound infection (3.9 vs. 1.1%; p = 0.037) and incisional hernia (6.5 vs. 1.9%; p = 0.006). Conclusion Multi-port laparoscopic cholecystectomy should remain the gold standard therapy for gallbladder disease.
Single-site robotic cholecystectomy is an effective alternative procedure for uncomplicated benign gallbladder disease in properly selected patients. This must be carefully balanced against a high rate of surgical site infection and incisional hernia, and patients should be informed of these risks. Background Laparoscopic Roux-en-Y gastric bypass (RYGB) has become the procedure of choice for the treatment of morbid obesity. Recently, several reports have shown the potential advantages of the robotic approach, notably by reducing complications. The aim of this study is to report our long-term experience with robotic Roux-en-Y gastric bypass (RYGB) and to compare outcomes with the laparoscopic approach. Methods From January 2003 to September 2013, 777 consecutive minimally invasive RYGB have been performed in our institution: 389 laparoscopically (50.1%) and 388 robotically (49.9%). During the study period, all the data regarding these consecutive RYGB have been prospectively collected in a dedicated database. Results While longer in duration compared to laparoscopy (+30 min; p = 0.0001), the robotic approach had a lower conversion rate (0.8 vs. 4.9%; p = 0.0007) and fewer complications (11.6% vs. 16.7%; p = 0.05), in particular fewer gastrointestinal leaks (0.3 vs. 3.6%; p = 0.0009). There were also fewer early reoperations (1 vs. 3.3%; p = 0.05) and a shorter hospital stay in the robotic group (6.2 vs. 10.4 days; p = 0.0001). There were no statistical differences between the early and the current robotic experience, except in operative time and hospital stay, which were shorter for the last 100 cases. Finally, the BMI loss was significantly higher in the laparoscopic group starting at the first post-operative year. Conclusions Robotic RYGB is not only safe and feasible, but also a valid option in comparison to laparoscopy.
At the cost of a longer operative time, we observed better short-term outcomes with the robotic approach BACKGROUND The robotic surgical system overcomes many technological obstacles of conventional laparoscopic surgery, and possesses enormous clinical applied potential. The aim of this study was to compare the efficacy of Zeus robot-assisted laparoscopic cholecystectomy with conventional laparoscopic cholecystectomy. METHODS Forty patients undergoing elective cholecystectomy were randomly divided into two groups. Patients in group A (n=20) underwent Zeus robot-assisted laparoscopic cholecystectomy, and patients in group B (n=20) received conventional laparoscopic cholecystectomy. The parameters on operative field, operative time, the number of actions, the rate of operative errors and minimal trauma were evaluated and compared between the two groups. RESULTS The number of clearing camera (1.1+/-1.0 times) and the time of adjusting the operative field (2.2+/-0.7 minutes) in group A were significantly less than those (4.5+/-1.5 times) and (7.5+/-1.2 minutes) in group B. The number of dissection actions (337+/-86 times) and the rate of operative errors (10%) in group A were less than those (389+/-94 times), (25%) in group B. The total operation time (104.9+/-20.5 minutes) and setup time (29.5+/-9.8 minutes) in group A were significantly longer than those (78.6+/-17.1 minutes), (12.6+/-2.5 minutes) in group B. Blood loss and postoperative hospitalization were similar. No postoperative complications occurred in both groups, and open cholecystectomy was performed in each group. CONCLUSIONS Zeus robot-assisted cholecystectomy inherits the benefits of minimally invasive surgery.
The Zeus robotic surgical system is better than conventional laparoscopic technique in controlling the operative field and can be manipulated precisely and stably though it requires more operative time Background: The efficacy of conventional laparoscopic cholecystectomy (CLC) was compared with robot-assisted laparoscopic cholecystectomy (RLC). Surgical trainees performed the LC to avoid the surgeon's experience bias. Methods: Two surgical trainees performed 10 CLCs and 10 RLCs at random with a Zeus-Aesop Surgical Robotic System. The primary efficacy parameters were the total time and the number of actions involved in the procedure. The secondary parameters were setup and dissection times, and the number of grasping and dissection actions. Surgical complications were evaluated. Results: For CLC and RLC, respectively, the total times were 95.4 ± 28 min and 123.5 ± 33.3 min and the total actions were 420 ± 176.3 and 363.5 ± 158.2. For CLC, the times required for setup (21 ± 10.4 min) and dissection (50.2 ± 17.7 min) were less than for RLC (33.8 ± 11.3 min and 72 ± 24.3 min, respectively). The numbers of grasping and dissection actions were not significantly different: 41.4 ± 26.5 and 378 ± 173.7, respectively, for CLC versus 48.9 ± 27 and 314.6 ± 141.9, respectively, for RLC. Conclusion: Although feasible, RLC requires significantly more time than CLC because of slower performed actions Background: Instrument positioners can position and lock a laparoscopic instrument. This study uses time-action analysis to evaluate objectively whether IPs can substitute for a surgical assistant efficiently and safely.
Methods: In four hospitals, 78 laparoscopic cholecystectomies were randomly assisted by a surgical assistant or an instrument positioner (AESOP and PASSIST). The efficiency and safety of laparoscopic cholecystectomies were analyzed with respect to time, number and type of actions, positioning accuracy, and peroperative complications. A questionnaire evaluated the difficulties for each operation and the comfort of instrument positioner use. Results: The PASSIST and AESOP were able to replace the surgical assistant during laparoscopic cholecystectomies without significantly changing either the efficiency or the safety of the operation. The questionnaire showed that the surgeons preferred to operate with an instrument positioner. Conclusion: This study assessed objectively that instrument positioners can substitute for a surgical assistant efficiently and safely in elective laparoscopic cholecystectomies OBJECTIVE To analyze the preliminary experience with the new da Vinci single-site technology for cholecystectomy. HYPOTHESIS Single-incision laparoscopic cholecystectomy is technically challenging and a related learning curve clearly exists. A novel approved robotic single-port platform has recently been introduced. This technology may help overcome some of the limitations of manual single-incision surgery relating to triangulation of instruments, ergonomics, and surgical exposure. DESIGN A prospective longitudinal observational study was conducted on 100 consecutive da Vinci single-site cholecystectomies. SETTING Five Italian centers of robotic general surgery. MAIN OUTCOME MEASURES Primary end points were feasibility without conversion and the absence of major complications. Operative times were analyzed to define the learning curve using a mixed regression model. A questionnaire collected the opinions of the surgeons involved in using the new technique. RESULTS Two patients underwent conversion.
No major intraoperative complications occurred, but there were 12 minor incidents (7 ruptures of the gallbladder and 5 cases of minor bleeding from the gallbladder bed). Mean (SD) total operative time was 71 (19) minutes, with a mean (SD) console time of 32 (13) minutes. No significant reduction in the operative times was observed with the increasing of each surgeon's experience. The technique was judged more complex than standard 4-port laparoscopy but easier than single-incision laparoscopy. CONCLUSIONS Da Vinci single-site cholecystectomy is an easy and safe procedure for expert robotic surgeons. It allows the quick overcoming of the learning curve typical of single-incision laparoscopic surgery and may potentially increase the safety of this approach HYPOTHESIS Robotic technology is the most advanced development of minimally invasive surgery, but there are still some unresolved issues concerning its use in a clinical setting. DESIGN The study describes the clinical experience of the Department of General Surgery, Misericordia Hospital, Grosseto, Italy, in robot-assisted surgery using the da Vinci Surgical System. RESULTS Between October 2000 and November 2002, 193 patients underwent a minimally invasive robotic procedure (74 men and 119 women; mean age, 55.9 years [range, 16-91 years]). A total of 207 robotic surgical operations, including abdominal, thoracic and vascular procedures, were performed; 179 were single procedures, and 14 were double (2 operations on the same patient). There were 4 conversions to open surgery and 3 to conventional laparoscopy (conversion rate, 3.6%; 7 of 193 patients). The perioperative morbidity rate was 9.3% (18 of 193 patients), and 6 patients (3.1%) required a reoperation. The postoperative mortality rate was 1.5% (3 of 193 patients).
CONCLUSIONS Our preliminary experience at a large community hospital suggests that robotic surgery is feasible in a clinical setting. Its daily use is safe and easily managed, and it expands the applications of minimally invasive surgery. However, the best indications still have to be defined, and the cost-benefit ratio must be evaluated. This report could serve as a basis for a future prospective, randomized trial BACKGROUND This study estimates the number of laparoscopic cholecystectomies required until improvement ceases, assesses the magnitude of such improvement, and provides some insight into the mechanism by which it takes place. METHODS Data from 500 consecutive laparoscopic cholecystectomies were analyzed from a prospective database for number of short and long operations and operative time. RESULTS There was a 40% decrease (P < 0.05) in average operative time over the first 200 operations. Significant decrease in the number of longer operations, increase of shorter cases, and decrease in the range of operative time were noted. The major contributor was a marked shortening of longer cases, without much increased speed of shorter operations. CONCLUSIONS For laparoscopic cholecystectomy, improvement persists for about 200 operations, resulting in a 40% reduction in operative time. The primary mechanism of improvement seems to be an ability to deal more effectively with difficult cases Comparative studies between robotic and laparoscopic cholecystectomy (LC) focus heavily on economic considerations under the assumption of comparable clinical outcomes. Advancement of the robotic technique and the further widespread use of this approach suggest a need for newer comparison studies. 676 ICG-aided robotic cholecystectomies (ICG-aided RC) performed at the University of Illinois at Chicago (UIC) Division of General, Minimally Invasive and Robotic Surgery were compiled retrospectively.
Additionally, 289 LC were similarly obtained. Data were compared to the largest single institution LC data sets from within the US and abroad. Statistically significant variations were found between UIC-RC and UIC-LC in minor biliary injuries (p = 0.049), overall open conversion (p ≤ 0.001), open conversion in the acute setting (p = 0.002), and mean blood loss (p < 0.001). UIC-RC open conversions were also significantly lower than Greenville Health System LC (p ≤ 0.001). Additionally, UIC ICG-RC resulted in the lowest percentage of major biliary injuries (0%) and the highest percentage of biliary anomalies identified (2.07%). ICG-aided cholangiography and the technical advantages associated with the robotic platform may significantly decrease the rate of open conversion in both the acute and non-acute setting. The sample size discrepancy and the non-randomized nature of our study do not allow for drawing definitive conclusions
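The abstracts above compare complication proportions between two surgical arms at stated significance levels (e.g., wound infection in 3.9% of 415 robotic vs. 1.1% of 263 laparoscopic cases, p = 0.037). As a sketch of how such a comparison can be reproduced, the following example implements a standard pooled two-proportion z-test; the event counts (16 and 3) are back-calculated from the reported percentages and sample sizes and are an assumption, not figures taken from the paper.

```python
from math import sqrt, erf

def two_prop_ztest(x1, n1, x2, n2):
    """Pooled two-proportion z-test, two-sided p-value via the normal CDF."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # standard normal CDF: Phi(t) = 0.5 * (1 + erf(t / sqrt(2)))
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p

# Wound infection: assumed 16/415 (3.9%) robotic vs. 3/263 (1.1%) laparoscopic
z, p = two_prop_ztest(16, 415, 3, 263)
print(round(z, 2), round(p, 3))  # p comes out near the reported 0.037
```

Under these assumed counts the test reproduces the reported p-value to three decimals, which suggests the authors used a pooled normal approximation rather than an exact test.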
11,095
29,953,696
CONCLUSIONS Non-vitamin K oral anticoagulants halve the risk of fatal ICH in patients with non-valvular atrial fibrillation compared with VKAs , whereas indirect comparisons indicate that NOAC-specific reversal agents may be associated with a lower case fatality rate in NOAC-related ICH
BACKGROUND AND PURPOSE Intracranial hemorrhage (ICH) is the most feared complication in patients treated with oral anticoagulants due to non-valvular atrial fibrillation. Non-vitamin K oral anticoagulants (NOACs) reduce the risk of ICH compared with vitamin K antagonists (VKAs). We performed a systematic review and meta-analysis to evaluate the risk of fatal NOAC-related ICH compared with VKA-related ICH.
Background and Purpose — Intracranial hemorrhage is the most devastating complication of anticoagulation. Outcomes associated with different sites of intracranial bleeding occurring with warfarin versus dabigatran have not been defined. Methods — Analysis of 18 113 participants with atrial fibrillation in the Randomized Evaluation of Long-term anticoagulant therapY (RE-LY) trial assigned to adjusted-dose warfarin (target international normalized ratio, 2–3) or dabigatran (150 mg or 110 mg, both twice daily). Results — During a mean of 2.0 years of follow-up, 154 intracranial hemorrhages occurred in 153 participants: 46% intracerebral (49% mortality), 45% subdural (24% mortality), and 8% subarachnoid (31% mortality). The rates of intracranial hemorrhage were 0.76%, 0.31%, and 0.23% per year among those assigned to warfarin, dabigatran 150 mg, and dabigatran 110 mg, respectively (P<0.001 for either dabigatran dose versus warfarin). Fewer fatal intracranial hemorrhages occurred among those assigned dabigatran 150 mg and 110 mg (n=13 and n=11, respectively) versus warfarin (n=32; P<0.01 for both). Fewer traumatic intracranial hemorrhages occurred among those assigned to dabigatran (11 patients with each dose) compared with warfarin (24 patients; P<0.05 for both dabigatran doses versus warfarin). Independent predictors of intracranial hemorrhage were assignment to warfarin (relative risk, 2.9; P<0.001), aspirin use (relative risk, 1.6; P=0.01), age (relative risk, 1.1 per year; P<0.001), and previous stroke/transient ischemic attack (relative risk, 1.8; P=0.001). Conclusions — The clinical spectrum of intracranial hemorrhage was similar for patients given warfarin and dabigatran. Absolute rates at all sites and both fatal and traumatic intracranial hemorrhages were lower with dabigatran than with warfarin.
Concomitant aspirin use was the most important modifiable independent risk factor for intracranial hemorrhage Background and Purpose — Patients with atrial fibrillation and previous ischemic stroke (IS)/transient ischemic attack (TIA) are at high risk of recurrent cerebrovascular events despite anticoagulation. In this prespecified subgroup analysis, we compared warfarin with edoxaban in patients with versus without previous IS/TIA. Methods — ENGAGE AF-TIMI 48 (Effective Anticoagulation With Factor Xa Next Generation in Atrial Fibrillation-Thrombolysis in Myocardial Infarction 48) was a double-blind trial of 21 105 patients with atrial fibrillation randomized to warfarin (international normalized ratio, 2.0–3.0; median time-in-therapeutic range, 68.4%) versus once-daily edoxaban (higher-dose edoxaban regimen [HDER], 60/30 mg; lower-dose edoxaban regimen, 30/15 mg) with 2.8-year median follow-up. Primary end points included all stroke/systemic embolic events (efficacy) and major bleeding (safety). Because only HDER is approved, we focused on the comparison of HDER versus warfarin. Results — Of 5973 (28.3%) patients with previous IS/TIA, 67% had CHADS2 (congestive heart failure, hypertension, age, diabetes, prior stroke/transient ischemic attack) > 3 and 36% were ≥75 years. Compared with 15 132 without previous IS/TIA, patients with previous IS/TIA were at higher risk of both thromboembolism and bleeding (stroke/systemic embolic events 2.83% versus 1.42% per year; P<0.001; major bleeding 3.03% versus 2.64% per year; P<0.001; intracranial hemorrhage, 0.70% versus 0.40% per year; P<0.001). Among patients with previous IS/TIA, annualized intracranial hemorrhage rates were lower with HDER than with warfarin (0.62% versus 1.09%; absolute risk difference, 47 [8–85] per 10 000 patient-years; hazard ratio, 0.57; 95% confidence interval, 0.36–0.92; P=0.02).
No treatment subgroup interactions were found for primary efficacy (P=0.86) or for intracranial hemorrhage (P=0.28). Conclusions — Patients with atrial fibrillation with previous IS/TIA are at high risk of recurrent thromboembolism and bleeding. HDER is at least as effective and is safer than warfarin, regardless of the presence or the absence of previous IS or TIA. Clinical Trial Registration — URL: http://www.clinicaltrials.gov. Unique identifier: NCT00781391 BACKGROUND Specific reversal agents for non-vitamin K antagonist oral anticoagulants are lacking. Idarucizumab, an antibody fragment, was developed to reverse the anticoagulant effects of dabigatran. METHODS We undertook this prospective cohort study to determine the safety of 5 g of intravenous idarucizumab and its capacity to reverse the anticoagulant effects of dabigatran in patients who had serious bleeding (group A) or required an urgent procedure (group B). The primary end point was the maximum percentage reversal of the anticoagulant effect of dabigatran within 4 hours after the administration of idarucizumab, on the basis of the determination at a central laboratory of the dilute thrombin time or ecarin clotting time. A key secondary end point was the restoration of hemostasis. RESULTS This interim analysis included 90 patients who received idarucizumab (51 patients in group A and 39 in group B). Among 68 patients with an elevated dilute thrombin time and 81 with an elevated ecarin clotting time at baseline, the median maximum percentage reversal was 100% (95% confidence interval, 100 to 100). Idarucizumab normalized the test results in 88 to 98% of the patients, an effect that was evident within minutes. Concentrations of unbound dabigatran remained below 20 ng per milliliter at 24 hours in 79% of the patients. Among 35 patients in group A who could be assessed, hemostasis, as determined by local investigators, was restored at a median of 11.4 hours.
Among 36 patients in group B who underwent a procedure, normal intraoperative hemostasis was reported in 33, and mildly or moderately abnormal hemostasis was reported in 2 patients and 1 patient, respectively. One thrombotic event occurred within 72 hours after idarucizumab administration in a patient in whom anticoagulants had not been reinitiated. CONCLUSIONS Idarucizumab completely reversed the anticoagulant effect of dabigatran within minutes. (Funded by Boehringer Ingelheim; RE-VERSE AD ClinicalTrials.gov number, NCT02104947.) Background Idarucizumab, a monoclonal antibody fragment, was developed to reverse the anticoagulant effect of dabigatran. Methods We performed a multicenter, prospective, open-label study to determine whether 5 g of intravenous idarucizumab would be able to reverse the anticoagulant effect of dabigatran in patients who had uncontrolled bleeding (group A) or were about to undergo an urgent procedure (group B). The primary end point was the maximum percentage reversal of the anticoagulant effect of dabigatran within 4 hours after the administration of idarucizumab, on the basis of the diluted thrombin time or ecarin clotting time. Secondary end points included the restoration of hemostasis and safety measures. Results A total of 503 patients were enrolled: 301 in group A, and 202 in group B. The median maximum percentage reversal of dabigatran was 100% (95% confidence interval, 100 to 100), on the basis of either the diluted thrombin time or the ecarin clotting time. In group A, 137 patients (45.5%) presented with gastrointestinal bleeding and 98 (32.6%) presented with intracranial hemorrhage; among the patients who could be assessed, the median time to the cessation of bleeding was 2.5 hours.
In group B, the median time to the initiation of the intended procedure was 1.6 hours; periprocedural hemostasis was assessed as normal in 93.4% of the patients, mildly abnormal in 5.1%, and moderately abnormal in 1.5%. At 90 days, thrombotic events had occurred in 6.3% of the patients in group A and in 7.4% in group B, and the mortality rate was 18.8% and 18.9%, respectively. There were no serious adverse safety signals. Conclusions In emergency situations, idarucizumab rapidly, durably, and safely reversed the anticoagulant effect of dabigatran. (Funded by Boehringer Ingelheim; RE-VERSE AD ClinicalTrials.gov number, NCT02104947.)
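The abstracts in this record express event rates as percent per year and differences as events per 10,000 patient-years (e.g., intracranial hemorrhage of 0.62% vs. 1.09% per year with edoxaban vs. warfarin, an absolute difference of 47 per 10,000 patient-years). The unit conversion is simple arithmetic; a minimal sketch, using the figures quoted above:

```python
def per_10k_patient_years(annual_pct):
    """Convert an annualized event rate in percent to events per 10,000 patient-years."""
    return annual_pct / 100 * 10_000

# ICH rates (% per year) in the ENGAGE AF-TIMI 48 prior-IS/TIA subgroup
warfarin_ich, edoxaban_ich = 1.09, 0.62
diff = per_10k_patient_years(warfarin_ich) - per_10k_patient_years(edoxaban_ich)
print(round(diff))  # reproduces the reported absolute risk difference of 47
```

Note that the confidence interval around that difference (8 to 85 per 10,000 patient-years in the abstract) cannot be recovered from the point rates alone; it requires the underlying event counts and exposure.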
11,096
30,954,199
The findings of these studies suggest that early ambulation and modified positioning were effective in reducing back pain in patients undergoing coronary angiography. Early ambulation 2-4 hours after angiography, together with changing the patients' position and modified positioning, reduces the back pain of these patients
Coronary angiography is a gold standard tool for diagnosis of coronary artery disease. After this test, patients are restricted to bed rest to prevent vascular complications. Immobilization and bed rest can cause back pain in these patients. The objective of this rapid systematic review is to assess the efficacy of interventions for reducing back pain after transfemoral coronary angiography.
Cumulative exposure to socioeconomic disadvantage across the life course may be inversely associated with coronary heart disease (CHD); the mechanisms are not fully clear. An objective of this study was to determine whether cumulative life-course socioeconomic position (SEP) is associated with CHD incidence in a well-characterized US cohort that had directly assessed childhood and adulthood measures of SEP and prospectively measured CHD incidence. Furthermore, analyses aimed to evaluate whether adjustment for CHD risk factors reduces the association between cumulative life-course SEP and CHD. The authors examined 1,835 subjects who participated in the Framingham Heart Study Offspring Cohort from 1971 through 2003 (mean age, 35.0 years; 52.4% women). Childhood SEP was measured as father's education; adulthood SEP was assessed as own education and occupation. CHD incidence included myocardial infarction, coronary insufficiency, and coronary death. Cox proportional hazards analyses indicated that cumulative SEP was associated with incident CHD after adjustment for age and sex (hazard ratio = 1.82, 95% confidence interval: 1.17, 2.85 for low vs. high cumulative SEP score). Adjustment for CHD risk factors reduced the magnitude of association (hazard ratio = 1.29, 95% confidence interval: 0.78, 2.13). These findings underscore the potential importance of CHD prevention and treatment efforts for those whose backgrounds include low SEP throughout life Background: The optimal length of bed rest after femoral coronary angiography is still unknown. Short immobilisation could cause puncture site complications due to the modern antiplatelet therapy used, while long immobilisation time increases the risk of back pain for the patient. Purpose: To assess the safety, as well as perceived comfort, of early mobilisation after coronary angiography with femoral approach.
Methods: A randomised, single centre pilot trial with 104 coronary angiography patients (including 58 patients with non ST-elevation acute coronary syndrome) assigned to a post-procedural bed rest time of either 1.5 or 5 h. The primary endpoint was any incidence of vascular complication. Patients' discomfort was measured as self-perceived grade of pain in the back. Results: The presence of haematomas ≥ 5 cm was 5.8% in the short immobilisation group vs. 3.8% in the control group (ns). There was a significantly lower rate of perceived back pain in the short immobilisation group, compared to the controls, at the time of mobilisation, which remained significant also after 2 h of mobilisation. Conclusion: Early ambulation after coronary angiography is safe, without affecting the incidence of vascular complications, and decreases the patients' pain, both during and after the bed rest Background: Vascular access complications may be a cause of discomfort, prolonged hospital stay, and impaired outcomes in patients undergoing cardiac catheterisation. Aims: To assess vascular access complications in our patients with/without the use of closure devices as a first local benchmark for subsequent quality improvement. Methods: A nurse-led single-centre prospective survey of all vascular access complications in consecutive patients submitted to cardiac catheterisation during 4 months. Results: The radial and femoral access were used in 78 (14%) and 470 (83%), respectively, of 564 procedures, and a closure device was used in 136 of the latter. A haematoma (any size) was isolated and uneventful in 9.6% of cases. More severe complications (haemoglobin loss > 2 g, need for blood transfusion or vascular repair) occurred in 1.2% of cases, namely: in none of the procedures with radial access, and in 0.4% and 2.4% of femoral diagnostic and interventional coronary procedures, respectively.
During complicated (n = 40) vs uncomplicated (n = 172) transfemoral interventions, the activated coagulation time was 309 ± 83 vs 271 ± 71 s (p = 0.004), but the use of closure devices was similar. Conclusion: Severe vascular access complications in our patients were fewer than in most reports, and virtually absent in radial procedures. Vigorous anticoagulation was associated with increased complications in our patients, but closure devices were not. A new policy including both the use of the radial access whenever possible, and a less aggressive anticoagulation regimen during transfemoral interventions will be tested OBJECTIVE: To test the feasibility of creating a valid and reliable checklist with the following features: appropriate for assessing both randomised and non-randomised studies; provision of both an overall score for study quality and a profile of scores not only for the quality of reporting, internal validity (bias and confounding) and power, but also for external validity. DESIGN: A pilot version was first developed, based on epidemiological principles, reviews, and existing checklists for randomised studies. Face and content validity were assessed by three experienced reviewers and reliability was determined using two raters assessing 10 randomised and 10 non-randomised studies. Using different raters, the checklist was revised and tested for internal consistency (Kuder-Richardson 20), test-retest and inter-rater reliability (Spearman correlation coefficient and sign rank test; kappa statistics), criterion validity, and respondent burden. MAIN RESULTS: The performance of the checklist improved considerably after revision of a pilot version. The Quality Index had high internal consistency (KR-20: 0.89) as did the subscales apart from external validity (KR-20: 0.54). Test-retest (r 0.88) and inter-rater (r 0.75) reliability of the Quality Index were good.
Reliability of the subscales varied from good (bias) to poor (external validity). The Quality Index correlated highly with an existing, established instrument for assessing randomised studies (r 0.90). There was little difference between its performance with non-randomised and with randomised studies. Raters took about 20 minutes to assess each paper (range 10 to 45 minutes). CONCLUSIONS: This study has shown that it is feasible to develop a checklist that can be used to assess the methodological quality not only of randomised controlled trials but also non-randomised studies. It has also shown that it is possible to produce a checklist that provides a profile of the paper, alerting reviewers to its particular methodological strengths and weaknesses. Further work is required to improve the checklist and the training of raters in the assessment of external validity OBJECTIVE To examine the effects of ambulation at 3 versus 6 hours on delayed bleeding, pain, and anxiety in patients after cardiac angiogram. DESIGN Experimental, pretest posttest, random assignment. SETTING Western Canadian University-affiliated tertiary care hospital. PATIENTS Thirty-nine patients who underwent cardiac angiograms. OUTCOME MEASURES Delayed bleeding, pain, and anxiety. INTERVENTION The experimental group ambulated at 3 hours after cardiac angiogram; the control group ambulated at 6 hours. Delayed bleeding was evaluated by sanguinous drainage through a standard gauze pressure dressing and/or the presence of a palpable hematoma greater than 5 cm in width. Melzack's Present Pain Intensity Scale and Spielberger's State Anxiety Inventory were used to evaluate patient comfort at 2, 4, and 7 hours after angiogram and the next day. RESULTS None of the patients experienced any delayed bleeding. Student's t test was used to compare pain levels and anxiety scores.
In addition, repeated measures analysis of variance was applied to pain scores taken at 4 hours, 7 hours, and the next day. The 2-hour observation data were used as a covariate and a basis for comparison of pain at the next three observations. Patients ambulating early had significantly less pain overall (p < 0.005) and less back pain at 4 and 7 hours after angiogram (p < 0.05). There was no significant difference in the mean anxiety scores. CONCLUSION The significant decrease in back pain of patients who ambulated earlier demonstrates the need to consider patient comfort as well as the potential risks and sequelae of delayed bleeding INTRODUCTION After coronary angiography, to prevent potential complications, patients are restricted to 4-24 hours of bed rest in the supine position. This study was designed to assess the effect of changing position and early ambulation on low back pain, urinary retention, bleeding and hematoma after cardiac catheterization. METHODS In this clinical trial, 140 patients selected by convenience sampling were randomly divided into four 35-patient groups. The patients in the control group were in the supine position for 6 hours without movement. Change of position was applied to the second group (based on a specific protocol), early ambulation was applied to the third group, and both early ambulation and change of position were applied to the fourth group. Then, severity of bleeding, hematoma, back pain and urinary retention were measured at zero, 1, 2, 4, 6, and 24 hours after angiography. The data were collected through an individual data questionnaire and the Numerical Rating Scale (NRS) of pain, and Kristin Swain's checklist was applied to evaluate the severity of bleeding and hematoma. RESULTS None of the patients developed vascular complications. Incidence of urinary retention was higher in the control group, although this difference was not significant.
The mean pain intensity at the fourth and sixth hours showed a significant difference. CONCLUSION Based on the findings of this study, changing patients' position can be safe and they can be ambulated early after angiography The general recommended strategy after arterial invasive procedures is a 4- to 6-hour bed rest that is associated with patient discomfort and increased medical costs. We hypothesized that mobilization of selected patients at the second hour would not increase vascular complications. Coronary angiography was performed through the femoral route via 6-Fr catheters. Hemostasis was achieved by manual compression and maintained with a compressive bandage. A total of 1446 patients were ambulated at the second hour and 1226 of them were discharged without complication. A total of 220 patients required further follow-up due to blood oozing; 154 patients were conventionally ambulated due to difficult arterial access, longer (> 15 minutes) compression time, hematoma formation within 2 hours, or hypertensive state (blood pressure > 180/100 mm Hg). Twenty-five (16%) of those patients developed minor bleeding after ambulation. No major bleeding or large hematoma was observed during in-hospital observation. Ecchymosis (10% [2-hour group] vs 21% [4-5 hour group]) and small hematomas (22% vs 9%) were the most frequent complications after discharge. Early mobilization of selected patients undergoing diagnostic heart catheterization through the femoral artery via 6-Fr catheters is safe and associated with acceptable bleeding complication rates AIM This paper is a report of a study to investigate the effect of three positioning protocols on back pain, heart rate, blood pressure and vascular complications after cardiac catheterization.
BACKGROUND After cardiac catheterization, bed rest is prescribed in order to minimize vascular complications, but this often leads to back pain and other complications, such as hemodynamic instability. METHODS A three-group quasi-experimental design was used in this study, which was conducted in 2006. A convenience sample of 105 patients was randomly assigned to either the control group or one of the two experimental groups (A and B). The control group received routine care. Group B was treated only with modified positioning and group A with modified positioning and a pillow under their body. Back pain, heart rate, arterial blood pressure, haematoma formation and bleeding were measured at regular time intervals. FINDINGS The control group experienced higher levels of pain at 3, 6, and 8 hours and the morning after catheterization. The level of pain in group B was also higher than in group A at 3 hours after the procedure. Mean heart rate and blood pressure were lower in the experimental groups compared with the control group at 6 and 8 hours after catheterization. No statistically significant difference between the three groups regarding the amounts of overall bleeding and overall haematoma formation was observed. CONCLUSION Changing position in bed and using a supportive pillow during the early hours after cardiac catheterization can effectively minimize pain and hemodynamic instability without increasing vascular complications BACKGROUND Small trials have suggested that radial access for percutaneous coronary intervention (PCI) reduces vascular complications and bleeding compared with femoral access. We aimed to assess whether radial access was superior to femoral access in patients with acute coronary syndromes (ACS) who were undergoing coronary angiography with possible intervention. METHODS The RadIal Vs femorAL access for coronary intervention (RIVAL) trial was a randomised, parallel group, multicentre trial.
Patients with ACS were randomly assigned (1:1) by a 24 h computerised central automated voice response system to radial or femoral artery access. The primary outcome was a composite of death, myocardial infarction, stroke, or non-coronary artery bypass graft (non-CABG)-related major bleeding at 30 days. Key secondary outcomes were death, myocardial infarction, or stroke; and non-CABG-related major bleeding at 30 days. A masked central committee adjudicated the primary outcome, components of the primary outcome, and stent thrombosis. All other outcomes were as reported by the investigators. Patients and investigators were not masked to treatment allocation. Analyses were by intention to treat. This trial is registered with ClinicalTrials.gov, NCT01014273. FINDINGS Between June 6, 2006, and Nov 3, 2010, 7021 patients were enrolled from 158 hospitals in 32 countries. 3507 patients were randomly assigned to radial access and 3514 to femoral access. The primary outcome occurred in 128 (3·7%) of 3507 patients in the radial access group compared with 139 (4·0%) of 3514 in the femoral access group (hazard ratio [HR] 0·92, 95% CI 0·72-1·17; p=0·50). Of the six prespecified subgroups, there was a significant interaction for the primary outcome, with benefit for radial access in the highest tertile volume radial centres (HR 0·49, 95% CI 0·28-0·87; p=0·015) and in patients with ST-segment elevation myocardial infarction (0·60, 0·38-0·94; p=0·026). The rate of death, myocardial infarction, or stroke at 30 days was 112 (3·2%) of 3507 patients in the radial group compared with 114 (3·2%) of 3514 in the femoral group (HR 0·98, 95% CI 0·76-1·28; p=0·90). The rate of non-CABG-related major bleeding at 30 days was 24 (0·7%) of 3507 patients in the radial group compared with 33 (0·9%) of 3514 patients in the femoral group (HR 0·73, 95% CI 0·43-1·23; p=0·23).
At 30 days, 42 of 3507 patients in the radial group had large haematoma compared with 106 of 3514 in the femoral group (HR 0·40, 95% CI 0·28-0·57; p<0·0001). Pseudoaneurysm needing closure occurred in seven of 3507 patients in the radial group compared with 23 of 3514 in the femoral group (HR 0·30, 95% CI 0·13-0·71; p=0·006). INTERPRETATION Radial and femoral approaches are both safe and effective for PCI. However, the lower rate of local vascular complications may be a reason to use the radial approach. FUNDING Sanofi-Aventis, Population Health Research Institute, and Canadian Network for Trials Internationally (CANNeCTIN), an initiative of the Canadian Institutes of Health Research BACKGROUND Coronary angiography is a routine cardiac diagnostic procedure in Hong Kong. Patients are restricted to bedrest after the procedure due to potential vascular complications from using a femoral approach. Many patients are required to remain on bedrest for up to 24 hours after the procedure. The effect of reducing this bedrest time is still under investigation. In the meantime, nursing interventions aimed at decreasing patient discomfort due to prolonged bedrest are feasible to implement. AIMS The aims of this study were to evaluate the severity of back pain related to bedrest duration after coronary angiography and to compare the effects of changing patients' position in bed on their perceptions of back pain and on vascular complications. METHODS An experimental design was used, with patients randomly assigned to either a control or an experimental group. The control group received the usual care, remaining supine and flat for 8-24 hours with the affected leg straight. The experimental group changed body position hourly, varying between supine, right side-lying, and left side-lying during the first 7 hours after coronary angiography.
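The 30-day event counts reported for the RIVAL trial can be sanity-checked arithmetically. The trial reports Cox-model hazard ratios; the sketch below computes only a crude, unadjusted risk ratio with a Wald confidence interval on the log scale (an approximation for illustration, not the trial's actual analysis), using the large-haematoma counts of 42/3507 radial vs 106/3514 femoral:

```python
import math

def risk_ratio(events_a, n_a, events_b, n_b, z=1.96):
    """Unadjusted risk ratio of group A vs group B with a 95% Wald CI
    computed on the log scale from raw event counts."""
    rr = (events_a / n_a) / (events_b / n_b)
    # Standard error of log(RR) for two independent proportions
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Large haematoma at 30 days: 42/3507 radial vs 106/3514 femoral
rr, lo, hi = risk_ratio(42, 3507, 106, 3514)
```

Because the events are rare and follow-up is a fixed 30 days, this crude ratio lands close to the reported hazard ratio of 0·40 (95% CI 0·28-0·57).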
RESULTS A total of 419 patients participated in the study (control, n = 213; experimental, n = 206). Regardless of group assignment, back pain intensity increased with longer time on bedrest. In addition, the control group reported higher levels of pain at all five assessment times. Vascular complications in terms of bleeding at the femoral site were not significantly different between the control and experimental groups. CONCLUSION The study findings suggest that patients may be able to safely change their position in bed earlier in the post-coronary angiography period than currently recommended in practice protocols. Changing position in bed may also reduce back pain, promote physical comfort, and possibly reduce patients' negative feelings toward coronary angiography OBJECTIVE To examine the effects that a modified positioning and mobilization routine had on back pain and delayed bleeding in patients who had received heparin and undergone cardiac angiography. DESIGN An experimental research design was used. Each patient was randomly assigned to either the control group, which required 6 hours of bed rest after cardiac angiography, or the experimental group. The experimental group had modified positioning, in which the head of the bed was elevated to a maximum of 45 degrees, and modified mobilization, in which they were ambulated briefly at the bedside 4 hours after angiography. SETTING Two cardiology units of a 700-bed urban teaching hospital in western Canada. SAMPLE All patients admitted for nonemergent cardiac angiography were approached for consent, to attain a sample of 29 patients, who were randomly assigned to the experimental or the control group. METHOD Each patient was randomly assigned before cardiac angiography. The assignment was confidential until the patient was admitted to the cardiac unit after angiography. A demographic tool and the McGill Present Pain Intensity Scale were used to collect data.
Perception of pain was evaluated over four observation periods. A research assistant monitored sanguineous drainage on the dressing and hematoma to evaluate the presence of delayed bleeding. DATA ANALYSIS Demographic information was analyzed primarily through descriptive statistics. Results were analyzed to compare back pain and delayed bleeding between the two groups. Wilcoxon scores and t tests were both used for analysis and correlated well with each other. RESULTS The group with the modified positioning and mobilization routine experienced significantly less pain overall (p = 0.02), less pain at each interval, and significantly less pain intensity (p < 0.05). There was no difference in bleeding. One person in each group had an estimated blood loss of more than 100 ml through the pressure dressing. CONCLUSION This pilot study supports our hypothesis that modifying the immobilization of patients after cardiac angiography is associated with a reduction in back pain and with no increase in delayed bleeding at the femoral access site. The results support the need for further investigation of ambulation interventions after cardiac angiography Transfemoral coronary angiography may cause acute and chronic complications. The aim of the present study was to assess the effects of changing the duration of keeping a sandbag over the catheter insertion site on the acute complications of coronary angiography. This quasi-experimental study was conducted on 60 patients undergoing transfemoral coronary angiography. Participants were selected using convenience sampling and were randomly assigned to an intervention (n = 30) or control group (n = 30). In the intervention group, the sandbag over the catheter insertion site was taken off at the third hour, whereas in the control group, based on routine care, the sandbag was taken off at the sixth hour after the angiography.
On admission and at 3, 6, 8, and 24 hours after the angiography, the patients in both groups were evaluated for groin pain, low back pain, urinary retention, discomfort, and vascular complications. Data were analyzed by repeated measures, Mann-Whitney, Friedman, independent t-test, chi-square, and Kolmogorov-Smirnov tests. The two groups showed no significant difference in terms of demographic, clinical, and pre-intervention catheterization characteristics (P > .05). Across the five time points, the two groups differed significantly in groin pain (P = .000), back pain (P = .000), urinary retention (P = .02), and comfort (P = .001), but not in vascular complications, including hematoma (P = .113), bleeding (P = .32), and bruising (P = .134). The results of this study showed that removing the sandbag at the third hour in patients receiving post-angiography care did not increase the incidence of vascular complications, whereas it decreased patients' back pain, groin pain, and urinary retention and promoted their comfort
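Several of the between-group comparisons in the sandbag study rely on a chi-square test of complication counts. As an illustration of that test only (the counts below are hypothetical, invented for the example; the abstract reports just p-values), a 2x2 Pearson chi-square statistic can be computed by hand:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for the
    2x2 table [[a, b], [c, d]] of counts."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: hematoma yes/no in two groups of 30 patients each
stat = chi_square_2x2(4, 26, 9, 21)
# Compare against the 3.84 critical value (df = 1, alpha = 0.05)
significant = stat > 3.84
```

A statistic below 3.84 is non-significant at the 5% level with one degree of freedom, which is the kind of result behind the study's non-significant vascular-complication comparisons.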
11,097
25,884,249
The most important harm identified was an increased body temperature after FMS or FMD treatment. From the twelve included trials there is no clear evidence that FMS or FMD provides additional benefit compared with conventional scaling and root planing.
BACKGROUND Periodontitis is chronic inflammation that causes damage to the soft tissues and bones supporting the teeth. Mild to moderate periodontitis affects up to 50% of adults. Conventional treatment is quadrant scaling and root planing. In an attempt to enhance treatment outcomes, alternative protocols for anti-infective periodontal therapy have been introduced: full-mouth scaling (FMS) and full-mouth disinfection (FMD), which is scaling plus use of an antiseptic. This review updates our previous review of full-mouth treatment modalities, which was published in 2008. OBJECTIVES To evaluate the clinical effects of 1) full-mouth scaling (over 24 hours) or 2) full-mouth disinfection (over 24 hours) for the treatment of chronic periodontitis compared with conventional quadrant scaling and root planing (over a series of visits at least one week apart). A secondary objective was to evaluate whether there was a difference in clinical effect between full-mouth disinfection and full-mouth scaling.
AIM This study aimed to compare the effect of single-visit full-mouth mechanical debridement (FMD) and quadrant-wise mechanical debridement (QMD) on the levels of serum interleukin (IL)-6, C-reactive protein (CRP) and soluble thrombomodulin. MATERIAL AND METHODS Thirty-six subjects with chronic periodontitis were randomly allocated to three groups: QMD, single-visit FMD with povidone iodine, or single-visit FMD with water. Serum IL-6 and soluble thrombomodulin were measured by enzyme-linked immunosorbent assay, and serum CRP was measured by the latex-enhanced nephelometric method. RESULTS Serum IL-6 level increased significantly immediately after debridement in all three groups, with this increase being greatest in the full-mouth groups. However, the increase in the full-mouth groups was not significantly higher than that of the quadrant-wise group. In the quadrant-wise group, serum IL-6 level decreased significantly 1 month after debridement compared with baseline. Serum soluble thrombomodulin decreased significantly in the full-mouth groups but not in the quadrant-wise group. Changes in CRP level were not significant at baseline or after debridement in any of the three groups. CONCLUSIONS FMD increased serum IL-6 and reduced serum soluble thrombomodulin to a greater extent than QMD, suggesting that the former technique has stronger transient effects on systemic vascular endothelial function than the latter A classical treatment for chronic adult periodontitis consists of four to six consecutive sessions of scaling and root planing at a 1- to 2-week interval. Such a so-called "quadrant or sextant therapy" might result in reinfection of a previously disinfected area by bacteria from an untreated region. The purpose of this study was to investigate, over an 8-month period, the clinical benefits of full-mouth disinfection within a 24-hour period in the control of chronic periodontitis.
Ten adult patients with advanced chronic periodontitis were randomly assigned to a test and a control group. The control group received the standard scheme of initial periodontal therapy, consisting of scaling and root planing per quadrant at 2-week intervals. In the test group, scaling and root planing of the four quadrants was performed within 24 hours and immediately followed by a thorough supra- and subgingival chlorhexidine application to limit any transfer of bacteria. The latter involved tongue brushing with a 1% chlorhexidine gel for 60 seconds, mouthrinsing with a 0.2% chlorhexidine solution twice for 60 seconds, repeated subgingival irrigation of all pockets with a 1% chlorhexidine gel (3 times within 10 minutes), and mouthrinsing twice daily with a 0.2% chlorhexidine solution for 2 weeks. In addition, both groups received thorough oral hygiene instructions. The plaque index, gingival index, probing depth, gingival recession, and bleeding on probing were recorded prior to professional cleaning and at 1, 2, 4, and 8 months afterwards. Although the test group scored higher plaque indices than the control group, especially at months 2 and 4, the gingival index and bleeding tendency showed similar improvements over time. However, when the gingival/plaque ratio was considered, the latter was lower in the test group at all follow-up visits. For pockets ≥ 7 mm, full-mouth disinfection showed a significantly (P = 0.01) higher reduction in probing depth at each follow-up visit with, at month 8, a reduction of 4 mm (from 8 mm to 4 mm), in comparison with 3 mm (from 8 mm to 5 mm) for the classical therapy. The increase in gingival recession in the full-mouth disinfection group remained below 0.7 mm, while in the control group it reached 1.9 mm after 8 months. This resulted in a gain in clinical attachment level of 3.7 mm for the test group versus 1.9 mm for the control group.
A radiographic examination also indicated a superior improvement for the test group when compared with the control group. This pilot study suggests that a full-mouth disinfection in one day results in an improved clinical outcome in chronic periodontitis as compared with scalings per quadrant at 2-week intervals over several weeks A treatment for periodontal infections often consists of consecutive root planings (per quadrant, at a 1- to 2-week interval), without proper disinfection of the remaining intra-oral niches (untreated pockets, tongue, saliva, mucosa and tonsils). Such an approach could theoretically lead to reinfection of previously treated pockets. The present study aims to examine the effect of a full-mouth disinfection on the microbiota in the above-mentioned niches. Moreover, the clinical benefit of such an approach was investigated. 16 patients with severe periodontitis were randomly allocated to a test and a control group. The patients from the control group were scaled and root planed, per quadrant, at 2-week intervals and received oral hygiene instructions. The patients from the test group received a full-mouth disinfection consisting of: scaling and root planing of all pockets in 2 visits within 24 h, in combination with tongue brushing with 1% chlorhexidine gel for 1 min, mouth rinsing with a 0.2% chlorhexidine solution for 2 min and subgingival irrigation of all pockets (3x in 10 min) with 1% chlorhexidine gel. Besides oral hygiene, the test group rinsed twice daily with 0.2% chlorhexidine and sprayed the tonsils with 0.2% chlorhexidine for 2 months. Plaque samples (pockets, tongue, mucosa and saliva) were taken at baseline and after 2 and 4 months, and changes in probing depth, attachment level and bleeding on probing were reported.
The full-mouth disinfection resulted in a statistically significant additional reduction/elimination of periodontopathogens, especially in the subgingival pockets, but also in the other niches. These microbiological improvements were reflected in a statistically significantly higher probing depth reduction and attachment gain in the test patients. These findings suggest that disinfection of all intra-oral niches within a short time span leads to significant clinical and microbiological improvements for up to 4 months AIM To test recolonization of periodontal lesions after full-mouth scaling and root planing (FM-SRP) or multiple-session SRP (MS-SRP) in a randomized clinical trial, and whether FM-SRP and MS-SRP result in different clinical outcomes. MATERIALS AND METHODS Thirty-nine subjects were randomly assigned to FM-SRP or MS-SRP groups. At baseline and after 3 months, probing pocket depth (PPD), plaque index (PlI) and bleeding on probing (BoP) were recorded. At baseline, immediately after treatment, and after 1, 2, 7, 14 and 90 days, paper point samples from a single site in the maxillary right quadrant were collected for microbiological analysis of five putative pathogens by polymerase chain reaction. RESULTS FM-SRP and MS-SRP resulted in significant reductions in PPD, BoP and PlI and in the overall detection frequencies of the five species after 3 months, without significant differences between treatments. Compared with MS-SRP, FM-SRP resulted in less recolonization of the five species in the tested sites, significantly so for Treponema denticola. CONCLUSION FM-SRP and MS-SRP result in overall clinically and microbiologically comparable outcomes, and recolonization of periodontal lesions may be better prevented by FM-SRP OBJECTIVES The beneficial effects of the one-stage, full-mouth disinfection remain controversial in the scientific literature.
This might be due to the fact that entire-mouth disinfection with the use of antiseptics has been confused with full-mouth scaling and root planing. This parallel, single-blind RCT aimed to compare several full-mouth treatment strategies with each other. MATERIAL AND METHODS Seventy-one patients with moderate periodontitis were randomly allocated to one of the following treatment strategies: scaling and root planing, quadrant by quadrant, at two-week intervals (negative control, NC); full-mouth scaling and root planing within 2 consecutive days (FRP); or three one-stage, full-mouth disinfection (FM) protocols within 2 consecutive days applying antiseptics to all intra-oral niches for periopathogens, using as antiseptics: chlorhexidine (FMCHX) for 2 months, amine fluoride/stannous fluoride for 2 months (FMF), or chlorhexidine for 2 months followed by amine fluoride/stannous fluoride for another 6 months (FMCHX+F). At baseline and after 2, 4, and 8 months a series of periodontal parameters was recorded. RESULTS All treatment strategies resulted in significant (p<0.05) improvements of all clinical parameters over the entire duration of the study. Inter-treatment differences were often encountered. The NC group nearly always showed significantly smaller improvements than the two CHX groups. The differences between the FRP or FM groups and the two CHX groups only sporadically reached statistical significance. CONCLUSION These observations indicate that the benefits of the "OSFMD" protocol are partially due to the use of the antiseptics and partially to the completion of the therapy in a short time BACKGROUND/AIMS Recent studies reported significant additional clinical and microbiological improvements when severe adult periodontitis was treated by means of a "one-stage full-mouth" disinfection instead of a standard treatment strategy with consecutive root planings quadrant per quadrant.
The one-stage full-mouth disinfection procedure involves scaling and root planing of all pockets within 24 h in combination with an extensive application of chlorhexidine to all intra-oral niches such as periodontal pockets, tongue dorsum, and tonsils (chairside, and at home for 2 months). This study aims to examine the relative importance of the use of chlorhexidine in the one-stage full-mouth disinfection protocol. METHODS Three groups of 12 patients each with advanced periodontitis were followed, from both a clinical and a microbiological point of view, over a period of 8 months. The patients from the control group were scaled and root planed, quadrant per quadrant, at two-week intervals. The 2 other groups underwent a one-stage full-mouth scaling and root planing (all pockets within 24 h) with (Fdis) or without (FRp = full-mouth root planing) the adjunctive use of chlorhexidine. At baseline and after 1, 2, 4 and 8 months, the following clinical parameters were recorded: plaque and gingivitis indices, probing depth, bleeding on probing and clinical attachment level. Microbiological samples were taken from different intra-oral niches (tongue, mucosa, saliva and pooled samples from single- and multi-rooted teeth). The samples were cultured on selective and non-selective media in order to evaluate the number of CFU/ml for the key periodontopathogens. At baseline, an anonymous questionnaire was given to the patients to record the perception of each treatment (post-operative pain, fever, swelling, etc.). RESULTS All 3 treatment strategies resulted in significant improvements for all clinical parameters, but the Fdis and FRp patients always reacted significantly more favourably than the control group, with an additional probing depth reduction of ±1.5 mm and an additional gain in attachment of ±2 mm (for pockets ≥ 7 mm).
Also from a microbiological point of view, both the FRp and Fdis patients showed additional improvements when compared with the control group, both in the reduction of spirochetes and motile organisms and in the number of CFU/ml of the key pathogens, especially when the subgingival plaque samples were considered. The differences between FRp and Fdis patients were negligible. CONCLUSIONS These findings suggest that the benefits of a "one-stage full-mouth disinfection" in the treatment of patients suffering from severe adult periodontitis probably result from the full-mouth scaling and root planing within 24 h rather than from the beneficial effect of chlorhexidine. The rise in body temperature on the second day after the full-mouth scaling and root planing seems to indicate a Shwartzman reaction PURPOSE To explore the clinical effect of full-mouth scaling and root planing (FM-SRP) on chronic periodontitis in comparison with quadrant scaling and root planing (Q-SRP). METHODS 60 patients with chronic periodontitis were randomly divided into 2 groups. The FM-SRP group received full-mouth scaling and root planing completed within the same day, while the Q-SRP group received quadrant scaling and root planing once a week for 4 weeks. Clinical parameters of plaque index (PI), gingival index (GI), bleeding on probing (BOP), probing depth (PD) and attachment loss (AL) were collected at baseline and at 3 and 6 months after treatment, as were the postoperative reactions. The data were analyzed by rank sum test (PI, GI), t test (PD, AL) and chi-square test (BOP), respectively. RESULTS When compared with baseline, both therapies resulted in significant improvements in all clinical parameters at the end of 3 and 6 months (P<0.01). However, there was no significant difference between the two groups at any time point (P>0.05).
24 hours after the first treatment, the percentage of patients with postoperative reactions was significantly higher in the FM-SRP group than in the Q-SRP group (P<0.05), but the patients could tolerate these reactions. CONCLUSION Both FM-SRP and Q-SRP are efficacious in treating chronic periodontitis, and the clinician can select the proper treatment modality according to the demands of clinical practice AIM To clinically, microbiologically and immunologically characterize periodontal debridement as a therapeutic approach for severe chronic periodontitis. MATERIAL AND METHODS Twenty-five patients presenting at least eight teeth with a probing pocket depth (PPD) of ≥5 mm and bleeding on probing (BOP) were selected and randomly assigned to quadrant-wise scaling and root planing or one session of full-mouth periodontal debridement. The following clinical outcomes were assessed: plaque index, BOP, position of gingival margin, relative attachment level (RAL) and PPD. Real-time PCR was used for quantitative analysis of Aggregatibacter actinomycetemcomitans, Porphyromonas gingivalis and Tannerella forsythia. Enzyme-linked immunosorbent assay permitted the detection of IL-1beta, prostaglandin E(2), INF-gamma and IL-10 in gingival crevicular fluid (GCF). All parameters were evaluated at baseline and at 3 and 6 months after treatment. RESULTS Both groups had similar mean PPD reduction and attachment gain over time. Beyond a significant reduction in bacterial levels after treatment in both groups, microbiological analysis failed to demonstrate significant differences between them. Finally, no difference was observed between groups with respect to the levels of inflammatory mediators in GCF.
CONCLUSION Periodontal debridement resulted in similar clinical, microbiological and immunological outcomes when compared with standard scaling and root planing, and therefore may be a viable approach for dealing with severe chronic periodontitis BACKGROUND The aim of the present study was to evaluate the effectiveness of non-surgical mechanical instrumentation at 2 different time intervals on short-term healing and to assess patient reactions following non-surgical periodontal therapy. METHODS The study population consisted of 100 patients with moderate periodontal disease. Patients were equally distributed into 2 groups, treated daily or weekly. The daily group received full-mouth daily scaling and root planing for 4 consecutive days. The weekly group was treated once a week for 4 weeks. All patients were asked about objective (lymphadenopathy, aphthous stomatitis, and edema) and subjective (fatigue, pain, pruritus, burning sensation, and dentinalgia) reactions. Clinical measurements of plaque index (PI), gingival index (GI), probing depth (PD), bleeding on probing (BOP), and gingival recession (GR) were taken at baseline and 3 months after treatment. All objective and subjective reactions were recorded after each treatment session. RESULTS The results of our study revealed a significant decrease in PI, GI, BOP, and PD measurements at the end of the third month, but no significant changes in GR. The incidence of subjective and objective reactions was higher in the daily treated group compared with the weekly group. Most of these complaints were observed after the third treatment session. CONCLUSIONS Within the limits of this study, no differences were observed between the study groups when the clinical parameters were evaluated.
However, taking the subjective and objective reactions into consideration, the smallest time interval for non-surgical periodontal procedures might be 1 week BACKGROUND The aim of the present study was to evaluate the clinical effects of one-stage periodontal debridement with an ultrasonic instrument, associated with 0.5% povidone (pvp)-iodine irrigation, in patients with chronic periodontitis. METHODS Forty-five patients were randomly assigned to three groups: the control group (CG) received quadrant root planing at 1-week intervals over four consecutive sessions; the periodontal debridement plus pvp-iodine group (PD-PIG) received a 45-minute full-mouth debridement with an ultrasonic instrument, associated with 0.5% pvp-iodine irrigation; and the periodontal debridement group (PDG) received a 45-minute full-mouth periodontal debridement with an ultrasonic instrument, associated with NaCl irrigation. RESULTS At the 3-month evaluation, the mean probing depth (PD) reduction was 2.51 ± 0.52 mm in CG, 2.53 ± 0.50 mm in PD-PIG, and 2.58 ± 0.60 mm in PDG (P<0.05). The clinical attachment level (CAL) analysis showed a statistically significant gain in all groups compared with baseline (1.87 ± 0.56 mm [CG], 1.94 ± 0.70 mm [PD-PIG], and 1.99 ± 0.92 mm [PDG]). Intergroup analysis of PD and CAL at 1 and 3 months showed no differences (P>0.05). The N-benzoyl-L-arginine-p-nitroanilide (BAPNA) test showed a significant reduction in trypsin activity only during the first month (P<0.05); at 3 months there were no differences compared with baseline (P=0.80).
CONCLUSION This study provides no evidence that pvp-iodine is effective as an adjunct for one-stage periodontal debridement The aim of this randomized clinical study was to compare full-mouth scaling and root planing (FM-SRP) in two sessions within 24 hours with quadrant-wise scaling and root planing (Q-SRP) in four sessions within 4-6 weeks and to evaluate (I) clinical outcome, (II) treatment efficiency, and (III) treatment discomfort of patients and therapists. Twenty individuals, aged 28-65 years, with severe chronic periodontitis were randomly assigned to treatment with FM-SRP or Q-SRP. At baseline and after 6 months, there were no between-group differences in clinical findings, treatment discomfort, or post-treatment body temperature. The therapists, however, felt that FM-SRP was more physically and psychologically demanding than Q-SRP. Mean effective scaling and root planing (SRP) time was 165.5 min during the two FM-SRP sessions and 202.1 min during the four Q-SRP sessions. FM-SRP's initial time saving of 36.6 min compared with Q-SRP diminished to 30.8 min at the 6-month follow-up due to rescaling needs. Total mean treatment time (comprising SRP and patient re-information and re-instruction in oral hygiene) during the first 6 months post-treatment was 321.2 min for FM-SRP and 353.0 min for Q-SRP. Thus, the mean saving in total treatment time with FM-SRP was 31.8 min compared with Q-SRP. In conclusion, this study found that both treatment modalities may be recommended for chronic periodontitis patients.
Although time saving is possible with FM-SRP, the modality may compromise the therapist's well-being if practiced frequently, due to the risk of musculoskeletal problems OBJECTIVE The aim of this study was to test the hypothesis that the one-stage full-mouth disinfection (FMD) provides greater clinical and microbiological improvement compared with full-mouth scaling and root planing (FM-SRP) within 24 h and quadrant scaling and root planing (Q-SRP) in patients with generalized chronic periodontitis. MATERIAL & METHODS Twenty-eight patients were randomized into three groups; 25 patients completed the study and were the basis for analysis. The Q-SRP group was scaled quadrant-wise at 1-week intervals. The other groups received a one-stage full-mouth scaling with (FMD) or without (FM-SRP) chlorhexidine. At baseline and after 1, 2, 4 and 8 months, clinical parameters were recorded and microbiological analysis was performed. RESULTS All three treatment modalities resulted in significant clinical improvement at every time point. There were group differences only after 1 and 2 months: the FM-SRP group showed a significantly higher reduction of probing depth and bleeding on probing compared with the other two groups. The bacteria could be reduced in every group, although this reduction was significant only for Prevotella intermedia in the FMD group 8 months after treatment. CONCLUSION All three treatment modalities lead to an improvement of the clinical and microbiological parameters, however without significant group differences after 8 months In a standard periodontal treatment strategy with consecutive root planings (per quadrant at a one- to two-week interval), reinfection of a disinfected area might occur before completion of the treatment. This study examines, both clinically and microbiologically, whether a full-mouth disinfection within 24 hours significantly improves the outcome of periodontal treatment.
Ten patients with advanced chronic periodontitis were randomly allocated to a test and a control group. The patients from the control group received scalings and root planings as well as oral hygiene instructions per quadrant at two-week intervals. Full-mouth disinfection in the test group was sought by the removal of all plaque and calculus (in two visits within 24 hours). In addition, at each of these visits, the tongue was brushed with a 1% chlorhexidine gel for one min and the mouth rinsed with a 0.2% chlorhexidine solution for two min. Furthermore, subgingival chlorhexidine (1%) irrigation was performed in all pockets. The recolonization of the pockets was retarded by oral hygiene and 0.2% chlorhexidine rinses during two weeks. The clinical parameters were recorded, and plaque samples were taken from the right upper quadrant at baseline and after one and two months. The test group patients showed a significantly higher reduction in probing depth for deep pockets at both follow-up visits (p < 0.05). At the one-month visit, differential phase-contrast microscopy revealed significantly lower proportions of spirochetes and motile rods in the test group (p = 0.01). Culturing showed that the test group harbored significantly fewer pathogenic organisms at one month (p = 0.005). At two months, the same sites harbored significantly more "beneficial" bacteria (p = 0.02). Moreover, all sites of the test group initially harboring P. gingivalis (6/10) became negative after treatment. These findings suggest that it is possible to achieve a significant improvement of the treatment outcome (both microbiologically and clinically) with a one-stage full-mouth disinfection BACKGROUND The aim of this randomized controlled clinical trial was to determine the effects of single-visit full-mouth ultrasonic debridement versus quadrant-wise therapy.
MATERIAL AND METHODS Thirty-six subjects with chronic periodontitis were randomly allocated to three groups: quadrant-wise ultrasonic debridement, single-visit full-mouth ultrasonic debridement with povidone iodine, and single-visit full-mouth ultrasonic debridement with water. Whole-mouth plaque, bleeding on probing (BOP), pocket depth and attachment level were recorded before treatment and 1, 3 and 6 months post-treatment. Plaque and saliva samples were collected for microbiological analysis. RESULTS After treatment, all groups showed significant improvement in clinical parameters. Full-mouth treatments resulted in similar improvements in full-mouth mean plaque percentage, probing pocket depth and probing attachment level as conventional therapy. When data were analysed based on pocket depth and tooth type, there was no difference between groups in probing depth reduction or attachment gains. The full-mouth groups demonstrated greater reduction in BOP% and number of pockets ≥ 5 mm, and the total treatment time was significantly shorter. The detection frequencies of periodontal pathogens in plaque and saliva showed slight changes with no difference between groups. CONCLUSION Single-visit full-mouth mechanical debridement may have limited additional benefits over quadrant-wise therapy in the treatment of periodontitis, but can be completed in a shorter time AIM To evaluate the clinical efficacy of (i) a single session of "full-mouth ultrasonic debridement" (Fm-UD) as an initial periodontal treatment approach and (ii) re-instrumentation of periodontal pockets not properly responding to initial subgingival instrumentation.
METHODS Forty-one patients, having on average 35 periodontal sites with probing pocket depth (PPD) ≥ 5 mm, were randomly assigned to two different treatment protocols following stratification for smoking: a single session of full-mouth subgingival instrumentation using a piezoceramic ultrasonic device (EMS PiezonMaster 400, A+PerioSlim tips) with water coolant (Fm-UD), or quadrant scaling/root planing (Q-SRP) with hand instruments. At 3 months, all sites with remaining PPD ≥ 5 mm were subjected to repeated debridement with either the ultrasonic device or hand instruments. Plaque, PPD, relative attachment level (RAL) and bleeding following pocket probing (BoP) were assessed at baseline, 3 and 6 months. Primary efficacy variables were percentage of "closed pockets" (PPD ≤ 4 mm), and changes in BoP, PPD and RAL. RESULTS The percentage of "closed pockets" was 58% at 3 months for the Fm-UD approach and 66% for the Q-SRP approach (p > 0.05). Both treatment groups showed a mean reduction in PPD of 1.8 mm, while the mean RAL gain amounted to 1.3 mm for Fm-UD and 1.2 mm for Q-SRP (p > 0.05). The re-treatment at 3 months resulted in a further mean PPD reduction of 0.4 mm and RAL gain of 0.3 mm at 6 months, independent of the use of ultrasonic or hand instruments. The efficiency of the initial treatment phase (time used for instrumentation/number of pockets closed) was significantly higher for the Fm-UD than the Q-SRP approach: 3.3 versus 8.8 min per closed pocket (p < 0.01). The efficiency of the re-treatment session at 3 months was 11.5 min for ultrasonic and 12.6 min for hand instrumentation (p > 0.05). CONCLUSION The results demonstrated that a single session of Fm-UD is a justified initial treatment approach that offers tangible benefits for the chronic periodontitis patient OBJECTIVES To determine the clinical effects of full-mouth compared with quadrant-wise scaling and root planing.
METHOD Twenty patients with chronic periodontitis (≥ 2 teeth per quadrant with probing pocket depths (PPD) ≥ 5 mm and bleeding on probing (BOP)) were randomized into a test group treated in two sessions with subgingival scaling and root planing within 24 h (full-mouth root planing (FMRP)) and a control group treated quadrant by quadrant in four sessions at intervals of 1 week (quadrant root planing (QRP)). PPD, relative attachment level (RAL) and BOP were recorded at baseline, 3 and 6 months. RESULTS Analysing first quadrant data, in moderately deep pockets (5 mm ≤ PPD < 7 mm) there was no evidence for a difference (FMRP-QRP) between both groups for PPD reduction (mean: -0.128 mm; CI: [-0.949, 0.693]; p=0.747), RAL gain (mean: 0.118 mm; CI: [-0.763, 1.000]; p=0.781), and BOP reduction (mean: -20.1%; CI: [-44.3, 4.2]; p=0.099). Likewise, no significant differences between treatments were found for initially deep pockets (PPD ≥ 7 mm), neither for first quadrant nor for whole mouth data. CONCLUSION The results of the present study demonstrated equally favourable clinical results following both treatment modalities BACKGROUND A standard treatment strategy for periodontal infections often consists of 4 consecutive sessions of scaling and root planing (per quadrant, at 1- to 2-week intervals), without proper disinfection of the remaining intra-oral niches for periodontopathogens. This could theoretically lead to a reinfection of previously disinfected pockets by bacteria from an untreated region/niche. This study aimed to investigate, over an 8-month period, the clinical benefits of a one stage full-mouth disinfection in the control of severe periodontitis. METHODS Sixteen patients with early-onset periodontitis and 24 patients with severe adult periodontitis were randomly assigned to test and control groups.
The control group was scaled and root planed, per quadrant, at 2-week intervals and given standard oral hygiene instructions. A one stage full-mouth disinfection (test group) was sought by scaling and root planing the 4 quadrants within 24 hours in combination with the application of chlorhexidine to all intra-oral niches for periodontopathogens. Besides oral hygiene, the test group also rinsed twice daily with a 0.2% chlorhexidine solution and sprayed the tonsils with a 0.2% chlorhexidine spray, for 2 months. The plaque index, gingival index, probing depth, bleeding on probing, gingival recession, and clinical attachment level were recorded at baseline and at 1, 2, 4, and 8 months afterwards. RESULTS The one stage full-mouth disinfection resulted, in comparison to the standard therapy, in a significant (P < 0.001) additional probing depth reduction and gain in attachment up to 8 months. For initial pockets ≥ 7 mm, the "additional" probing depth reduction at the 8 month follow-up was 1.2 mm for single-rooted and 0.9 mm for multi-rooted teeth, with corresponding additional gains in attachment of 1.0 mm and 0.8 mm, respectively. The additional improvements were observed for all subgroups (adult periodontitis, generalized early-onset cases, smokers), with the largest differences in the non-smoking adult periodontitis patients. CONCLUSIONS These findings suggest that a one stage full-mouth disinfection results in an improved clinical outcome for the treatment of chronic adult or early-onset periodontitis as compared to scaling and root planing per quadrant at 2-week intervals BACKGROUND The benefit of full-mouth disinfection (FDIS) over traditional scaling and root planing (SRP) remains equivocal, and it is not known whether the use of adjunctive antibiotics may enhance the effect of FDIS.
The aim of the present study is to test the hypothesis that there is no difference in the 1-year clinical outcome of therapy among groups of patients treated with conventional SRP performed over 2 to 3 weeks, or same-day FDIS, with or without adjunctive metronidazole. METHODS A total of 184 patients with moderate-to-severe periodontitis were randomly allocated to one of four treatment groups: 1) FDIS+metronidazole; 2) FDIS+placebo; 3) SRP+metronidazole; or 4) SRP+placebo. Recordings of plaque, bleeding on probing, probing depth (PD), and clinical attachment level (CAL) were carried out in four sites per tooth at baseline and at 3 and 12 months after treatment. RESULTS No differences were observed in the mean CAL or PD values between the four experimental groups at baseline and 3 or 12 months post-treatment. All four groups displayed significant improvements in all parameters. However, using absence of pockets ≥5 mm as the criterion for treatment success, the two groups receiving adjunctive metronidazole performed significantly better than the two placebo groups. CONCLUSION Metronidazole had a significant, adjunctive effect in patients with a metronidazole-sensitive subgingival microbiota on the clinical parameters of CAL, PD, and absence of pockets ≥5 mm BACKGROUND Full-mouth scaling (FMS) is claimed by some researchers to be superior to standard scaling and root planing (SRP). The aim of the present study was to evaluate clinical outcomes of two modalities of non-surgical periodontal therapy for patients with chronic periodontitis. METHODS In a prospective, randomized, controlled clinical study, 37 subjects with chronic periodontitis were treated by SRP in two quadrants at 4-week intervals (N=20) or by FMS (N=17). Clinical attachment level (CAL), probing depth (PD), and bleeding on probing (BOP) were recorded at premolar and molar teeth at baseline and after 6 and 12 months.
RESULTS Both therapies resulted in significant improvements of all clinical variables. After 12 months, CAL at pockets with PDs of 4 to 6 mm was reduced significantly from 4.5+/-0.8 mm to 3.4+/-1.0 mm with SRP and from 4.7+/-0.9 mm to 3.8+/-1.1 mm with FMS (P<0.001). PD decreased from 4.4+/-0.6 mm to 3.3+/-0.9 mm in the SRP group and from 4.5+/-0.7 mm to 3.5+/-1.0 mm in the FMS group (P<0.001). BOP was reduced from 63.6%+/-45.3% to 29.0%+/-42.6% in the SRP group and from 59.6%+/-43.8% to 28.6%+/-38.3% in the FMS group (P<0.001 and P=0.001, respectively). There were no significant differences between the groups with respect to CAL gain, PD, and BOP reduction. CONCLUSION Both therapy modalities have the same positive influence on clinical outcome at premolar and molar teeth with PDs of 4 to 6 mm OBJECTIVE To compare the clinical and microbiological effects of three protocols for nonsurgical periodontal therapy, including full-mouth scaling and root planing plus systemic antibiotics, on the treatment of chronic periodontitis patients. METHODS Twenty-nine patients diagnosed with moderate to severe chronic periodontitis, selected according to specific criteria, were randomly assigned to one of three treatment groups: quadrant scaling, full-mouth scaling, and full-mouth scaling supplemented by systemic antibiotics. Antibiotic selection was based on the results of individual susceptibility testing. Oral hygiene instructions and reinforcement were given during the study. All patients received a clinical periodontal and microbiological examination at baseline and at reexamination, 4-6 weeks after therapy. Means and standard deviations were calculated and differences between groups were analyzed via the Kruskal-Wallis test, p < 0.05. RESULTS The mean age of the study sample was 49.1 +/- 11.6 years old, and there were 17 men and 12 women.
Patients treated with antibiotics showed antimicrobial susceptibility for amoxicillin and doxycycline. All study groups showed a similar significant improvement in periodontal parameters. Plaque scores were reduced in a range of 29.0% to 42.6%. Bleeding on probing was reduced by 34.8% to 55.0%; the reduction for the full-mouth scaling group was larger. Mean reduction in pocket depth was 1.2 to 1.3 mm in all groups. Mean bacterial counts were reduced in the groups receiving full-mouth treatment, but not in the quadrant treatment group. CONCLUSION The three protocols for non-surgical periodontal treatment demonstrated a similar positive effect on clinical parameters; however, only full-mouth treatment groups showed a reduction in anaerobic microbial counts at re-examination
11,098
26,599,520
Most interventions successfully increased water intake, with 13 studies reporting an increase of at least 500 mL. The most effective strategies were instruction and self-monitoring using urine dipstick or 24 h urine volume. CONCLUSION All interventions carried out in the studies succeeded in increasing water intake, with none leading to decreases in intake, and these could be implemented in potential clinical trials in CKD.
BACKGROUND Maintaining adequate fluid intake has been hypothesized to be beneficial in slowing the progression of chronic kidney disease (CKD). The aim of this study was to undertake a systematic review to determine the most effective interventions to increase water intake.
High fluid intake is the only preventive dietary measure that can be recommended to all patients with stones. However, the efficacy of dietary advice given to patients is unknown. We compared the impact of dietary advice to increase hydration (group 1, 57 patients) and of no dietary advice (group 2, 83 patients) on 24-hour urine volume. No significant difference was noted between groups 1 (1,624 ml.) and 2 (1,732 ml.). We then determined if urine specific gravity dipsticks could help patients increase the 24-hour urine volume. A correlation between 24-hour urine volume and mean urine specific gravity was performed on 263 randomly chosen patients. There was an inverse relationship between urine specific gravity and 24-hour urine volume with a correlation coefficient of 0.522 (y = 1.0207 - 0.00374x). Most patients (81.6%) with 24-hour urine volumes of less than 2.1 had a urine specific gravity of more than 1.010. The use of specific gravity dipsticks was evaluated as a tool to help 24 patients increase the 24-hour urine volume. The 24-hour urine volume increased significantly (p less than 0.05, paired Student's t test) in patients after feedback from specific gravity dipsticks when they were instructed to keep the urine specific gravity at or less than 1.010 (average 24-hour urine volume increased 192%). We conclude that dietary advice may be insufficient to modify fluid intake habits in stone patients. However, modifications of fluid intake habits may be improved by feedback from specific gravity dipsticks Background and objectives Increased water intake may benefit kidney function. Prior to initiating a larger randomised controlled trial (RCT), we examined the safety and feasibility of asking adults with chronic kidney disease (CKD) to increase their water intake.
Design, setting, participants and measurements Beginning in October 2012, we randomly assigned 29 adults with stage 3 CKD (estimated glomerular filtration rate (eGFR) 30–60 mL/min/1.73 m2 and albuminuria) to one of two groups of water intake: hydration (n=18) or standard (n=11). We asked the hydration group to increase their water intake by 1.0–1.5 L/day (in addition to usual intake, depending on sex and weight) for 6 weeks, while the control group carried on with their usual intake. Participants collected a 24 h urine sample at baseline and at 2 and 6 weeks after randomisation. Our primary outcome was the between-group difference in change in 24 h urine volume from baseline to 6 weeks. Results 63% of participants were men, 81% were Caucasian and the average age was 61 years (SD 14 years). The average baseline eGFR was 40 mL/min/1.73 m2 (SD 11 mL/min/1.73 m2); the median albumin to creatinine ratio was 19 mg/mmol (IQR 6–74 mg/mmol). Between baseline and 6-week follow-up, the hydration group's average 24 h urine volume increased by 0.7 L/day (from 2.3 to 3.0 L/day) and the control group's 24 h urine decreased by 0.3 L/day (from 2.0 to 1.7 L/day; between-group difference in change: 0.9 L/day (95% CI 0.4 to 1.5; p=0.002)). We found no significant changes in urine or serum osmolality, electrolyte concentrations, or eGFR. No serious adverse events or changes in quality of life were reported. Conclusions A pilot RCT indicates adults with stage 3 CKD can successfully and safely increase water intake by up to 0.7 L/day in addition to usual fluid intake. Trial registration Registered with ClinicalTrials.gov, identifier NCT01753466 OBJECTIVES Several animal studies have shown that bladder performance improves as a result of diuresis. Whether increased urine output also has beneficial effects on elderly male bladder function and lower urinary tract symptoms is unknown.
METHODS We performed a randomized placebo-controlled trial of 141 men, 55 to 75 years of age, with moderate lower urinary tract symptoms. The experimental group drank 1.5 L of extra water daily. The control group consumed one tablespoon of placebo syrup daily. After 6 months, we evaluated bladder contractility, voided volumes, and the severity of lower urinary tract symptoms. The actual increase in water consumption was measured using the deuterium urine dilution method. RESULTS Water consumption in the intervention group increased by 359 mL (95% confidence interval [CI] 171 to 548) per 24 hours compared with the control group. At 6 months, no statistically significant effect was found in the maximal flow rate (0.9 mL/s, 95% CI -0.4 to 2.2) compared with placebo. A statistically significant effect was found for bladder pressure (20 cm H2O, 95% CI 6 to 34) and bladder wall stress (1.9 N/cm2, 95% CI 0.3 to 3.5). In addition, the experimental group had greater maximal (44 mL, 95% CI -1 to 90) and average (26 mL, 95% CI 1 to 51) voided volumes per urination. The subjective effect parameters improved in both groups, but no statistically significant differences were found between the two groups. CONCLUSIONS It seems possible to improve some aspects of male bladder function by drinking more water. However, the effects are too small to be clinically relevant Background: Nephrolithiasis is a recurrent disease, and one of the most effective methods for prevention of stone recurrence is increasing the urine output (> 2 L/day), but this is difficult to achieve. The aim of this study was to evaluate the effect of behavioral intervention by measurement of urine specific gravity using dipstick on 24-h urine volume in first renal stone patients. Materials and Methods: In this prospective randomized clinical study, 80 adult patients with a history of first renal stone were included.
Patients were divided into two groups with 40 patients in each group. We explained the importance of high fluid intake and high urine volume in the prevention of renal stones to all patients. Group A patients were trained to measure 24-h urine volume every 15 days, and group B patients were trained to keep urine specific gravity below 1.010 by using dipstick. We measured 24-h urine volume in each group before intervention, and at 3 months and 6 months after intervention, and compared them. Results: There were no significant differences between the two groups in 24-h urine volume before intervention (P = 0.41), but the difference was significant 3 months (P = 0.01) and 6 months (P = 0.01) after intervention. Patients' compliance was 20% in group A and 90% in group B (P < 0.05). Conclusion: The use of behavioral modification with dipstick is an effective method for control and maintenance of optimal urine volume; it resulted in better patient compliance with drinking water and is more effective for prevention of renal stones Dehydration is commonly believed to result in headache, but the effectiveness of increasing the water intake in patients who frequently suffer from headaches has not been studied thus far. In a pilot study, we examined the possible effects and feasibility of increased water intake in headache patients. Eighteen headache patients (all had migraine, two also had tension-type headache) were randomly allocated to placebo medication, or the advice to additionally drink 1.5 l of water per day, for a period of 12 weeks. Effect measurements consisted of a 2-week headache diary and the Migraine Specific Quality of Life (MSQOL) questionnaire. The advice to increase the daily fluid intake by 1.5 l increased the fluid intake in the intervention group by approximately 1 l. This reduced the total hours of headache in 2 weeks by 21 h (95% CI: -48 to 5).
Mean headache intensity decreased by 13 mm (95% CI: -32 to 5) on a visual analogue scale (VAS). The effects on MSQOL, number of headache episodes, and medication seemed to be small. The data of the present study suggest a reduction in the total number of hours and intensity of headache episodes after increased water intake. Our results seem to justify larger scaled research on the effectiveness of increased water intake in headache patients PURPOSE We define the role of urine volume as a stone risk factor in idiopathic calcium stone disease and test the actual preventive effectiveness of a high water intake. MATERIALS AND METHODS We studied 101 controls and 199 patients from the first idiopathic calcium stone episode. After a baseline study period the stone formers were divided by randomization into 2 groups (1 and 2) and they were followed prospectively for 5 years. Followup in group 1 only involved a high intake of water without any dietetic change, while followup in group 2 did not involve any treatment. Each year clinical, laboratory and radiological evaluation was obtained to determine urinary stone risk profile (including relative supersaturations of calcium oxalate, brushite and uric acid by Equil 2), recurrence rate and mean time to relapse. RESULTS The original urine volume was lower in male and female stone formers compared to controls (men with calcium oxalate stones 1,057 +/- 238 ml./24 hours versus normal men 1,401 +/- 562 ml./24 hours, p < 0.0001, and women with calcium oxalate stones 990 +/- 230 ml./24 hours versus normal women 1,239 +/- 440 ml./24 hours, p < 0.001). During followup, recurrences were noted within 5 years in 12 of 99 group 1 patients and in 27 of 100 group 2 patients (p = 0.008). The average interval for recurrences was 38.7 +/- 13.2 months in group 1 and 25.1 +/- 16.4 months in group 2 (p = 0.016).
The relative supersaturations for calcium oxalate, brushite and uric acid were much greater in baseline urine of the stone patients in both groups compared to controls. During followup, baseline values decreased sharply only in group 1. Finally, the baseline urine in patients with recurrences was characterized by a higher calcium excretion compared to urine of the patients without recurrences in both groups. CONCLUSIONS We conclude that urine volume is a real stone risk factor in nephrolithiasis and that a large intake of water is the initial therapy for prevention of stone recurrences. In cases of hypercalciuria it is suitable to prescribe adjuvant specific diets or drug therapy BACKGROUND Previously published investigations suggest a positive effect of increased water intake on headache, but a randomised controlled trial has not been done. OBJECTIVE To investigate the effects of increased water intake on headache. METHODS Randomised controlled trial in primary care with two groups and a follow-up period of 3 months. Patients were included if they had at least two episodes of moderately intense headache or at least five mildly intense episodes per month and a total fluid intake of less than 2.5 l/day. Both groups received written instructions about stress reduction and sleep improvement strategies. The intervention group additionally received the instruction to increase the daily water intake by 1.5 l. The main outcome measures were Migraine-Specific Quality of Life (MSQOL) and days with at least moderate headache per month. RESULTS We randomised 50 patients to the control group and 52 patients to the intervention group. Drinking more water resulted in a statistically significant improvement of 4.5 (confidence interval: 1.3-7.8) points on MSQOL. In addition, 47% in the water group reported much improvement (6 or higher on a 10-point scale) on perceived intervention effect against 25% in the control group.
However, drinking more water did not result in relevant changes in days with at least moderate headache. CONCLUSIONS Considering the observed positive subjective effects, it seems reasonable to recommend headache patients to try this non-invasive intervention for a short period of time to see whether they experience improvement Maintenance of weight loss remains a challenge for most individuals. Thus, practical and effective weight-loss maintenance (WTLM) strategies are needed. A two-group 12-month WTLM intervention trial was conducted from June 2007 to February 2010 to determine the feasibility and effectiveness of a WTLM intervention for older adults using daily self-monitoring of body weight, step count, fruit/vegetable (F/V) intake, and water consumption. Forty weight-reduced individuals (mean weight lost = 6.7±0.6 kg; body mass index [calculated as kg/m²] 29.2±1.1), age 63±1 years, who had previously participated in a 12-week randomized controlled weight-loss intervention trial, were instructed to record daily body weight, step count, and F/V intake (WEV [defined as weight, exercise, and F/V]). Experimental group (WEV+) participants were also instructed to consume 16 fl oz of water before each main meal (ie, three times daily), and to record daily water intake. Outcome measures included weight change, diet/physical activity behaviors, theoretical constructs related to health behaviors, and other clinical measures. Statistical analyses included growth curve analyses and repeated measures analysis of variance. Over 12 months, there was a linear decrease in weight (β=-0.32, P<0.001) and a quadratic trend (β=0.02, P<0.01) over time, but no group difference (β=-0.23, P=0.08).
Analysis of the 365 days of self-reported body weight for each participant determined that weight loss was greater over the study period in the WEV+ group than in the WEV group, corresponding to weight changes of -0.67 kg and 1.00 kg, respectively, and an 87% greater weight loss (β=-0.01, P<0.01). Overall compliance with daily tracking was 76% ± 5%. Daily self-monitoring of weight, physical activity, and F/V consumption is a feasible and effective approach for maintaining weight loss for 12 months, and daily self-monitoring of increased water consumption may provide additional WTLM benefits BACKGROUND AND OBJECTIVES Autosomal dominant polycystic kidney disease (ADPKD) leads to kidney failure in half of those affected. Increased levels of adenosine 3':5'-cyclic monophosphate (cAMP) play a critical role in disease progression in animal models. Water loading, by suppressing arginine vasopressin (AVP)-stimulated cAMP production, is a proposed therapy for ADPKD. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS The effects of acute and sustained water loading on levels of urine osmolality (Uosm) and cAMP in 13 subjects with ADPKD and 10 healthy controls were studied. Uosm and cAMP concentrations were measured before and after water loading. RESULTS Urine [cAMP] indexed to Uosm significantly decreased with acute water loading in both groups (58% in controls and 35% in ADPKD). Chronic water loading resulted in a nonsignificant 13% decrease in 24-hour urine cAMP excretion in ADPKD participants, despite an increase in 24-hour urine volume by 64% to 3.14 +/- 0.32 L and a decrease in mean Uosm by 46%, to below that of plasma (270 +/- 21 mOsm/L). CONCLUSIONS Increased water intake of 3 L per day decreased Uosm in most ADPKD subjects.
While urine [cAMP] accurately reflects changes in Uosm during acute water loading in ADPKD subjects, chronic water loading did not lower 24-hour urine cAMP excretion, although subjects with higher baseline [cAMP] (> 2 nmol/mg Cr) responded best. Decreases in urine [cAMP] and osmolality are consistent with decreased AVP activity. These results support the need for a larger study to evaluate the effect of chronic water loading on ADPKD progression BACKGROUND AND OBJECTIVES In animal models of polycystic kidney disease, the ingestion of large amounts of water promotes diuresis by suppressing plasma levels of arginine vasopressin (AVP) and renal levels of cAMP, slowing cyst progression. Whether simple water ingestion is a potential therapeutic strategy for individuals with autosomal dominant polycystic kidney disease (ADPKD) is unknown. In this study, a simple method to quantify the amount of water needed to achieve a specific mean urine osmolality target in patients with ADPKD was developed and tested. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS In eight ADPKD patients eating typical diets, osmolality and volume were measured in 24-hour urine collections. The amount of additional ingested water required daily to achieve a mean urine osmolality of 285 ± 45 mosm/kg was determined. Participants were instructed to distribute the prescribed water over waking hours for each of 5 days. Blood chemistries, 24-hour urine collections, BP, and weight were measured before and after the period of supplemental water intake. RESULTS Five patients achieved the 285 mosm/kg urine target without difficulty. Mean urine osmolality decreased and mean urine volume increased; serum sodium, weight, and BP were unchanged. Daily osmolar excretion remained constant, indicating a stable ad lib dietary intake of solutes and protein over the 2-week study period.
CONCLUSIONS The amount of additional water needed to achieve a urine osmolality target can be approximated from the urine osmolar excretion in ADPKD patients eating typical diets, providing a quantitative method to prescribe supplemental water for such individuals
11,099
31,699,066
Conclusion The meta-analysis suggests that the ADRB2 rs1042714 polymorphism has a protective association with asthma in the overall population and the pediatric subgroup
Background The published data on the association between β2-adrenergic receptor gene polymorphisms and asthma susceptibility are inconclusive . To derive a more precise estimation of this association , a meta-analysis was performed . G46A , p .
BACKGROUND The issue of whether regular use of an inhaled beta2-adrenergic agonist worsens airflow and clinical outcomes in asthma is controversial . Retrospective studies have suggested that adverse effects occur in patients with a genetic polymorphism that results in homozygosity for arginine ( Arg/Arg ) , rather than glycine ( Gly/Gly ) , at aminoacid residue 16 of the beta2-adrenergic receptor . However , the existence of any genotype-dependent difference has not been tested in a prospective clinical trial . METHODS Patients with mild asthma , not using a controller medication , were enrolled in pairs matched for forced expiratory volume in 1 s ( FEV1 ) according to whether they had the Arg/Arg ( n=37 ; four of 41 matches withdrew before randomisation ) or Gly/Gly ( n=41 ) genotype . Regularly scheduled treatment with albuterol or placebo was given in a masked , cross-over design , for 16-week periods . During the study , as-needed albuterol use was discontinued and ipratropium bromide was used as needed . Morning peak expiratory flow rate ( PEFR ) was the primary outcome variable . The primary comparisons were between treatment periods for each genotype ; the secondary outcome was a treatment by genotype effect . Analyses were by intention to treat . FINDINGS During the run-in period , when albuterol use was kept to a minimum , patients with the Arg/Arg genotype had an increase in morning PEFR of 23 L/min ( p=0.0162 ) ; the change in patients with the Gly/Gly genotype was not significant ( 2 L/min ; p=0.8399 ) . During randomised treatment , patients with the Gly/Gly genotype had an increase in morning PEFR during treatment with regularly scheduled albuterol compared with placebo ( 14 L/min [ 95 % CI 3 to 25 ] ; p=0.0175 ) . By contrast , patients with the Arg/Arg genotype had lower morning PEFR during treatment with albuterol than during the placebo period , when albuterol use was limited ( -10 L/min [ -19 to -2 ] ; p=0.0209 ) .
The genotype-attributable treatment difference was therefore -24 L/min ( -37 to -12 ; p=0.0003 ) . There were similar genotype-specific effects in FEV1 , symptoms , and use of supplementary reliever medication . INTERPRETATION Genotype at the 16th aminoacid residue of the beta2-adrenergic receptor affects the long-term response to albuterol use . Bronchodilator treatments avoiding albuterol may be appropriate for patients with the Arg/Arg genotype The number of published systematic reviews of studies of healthcare interventions has increased rapidly and these are used extensively for clinical and policy decisions . Systematic reviews are subject to a range of biases and increasingly include non-randomised studies of interventions . It is important that users can distinguish high quality reviews . Many instruments have been designed to evaluate different aspects of reviews , but there are few comprehensive critical appraisal instruments . AMSTAR was developed to evaluate systematic reviews of randomised trials . In this paper , we report on the updating of AMSTAR and its adaptation to enable more detailed assessment of systematic reviews that include randomised or non-randomised studies of healthcare interventions , or both . With moves to base more decisions on real world observational evidence we believe that AMSTAR 2 will assist decision makers in the identification of high quality systematic reviews , including those based on non-randomised studies of healthcare interventions BACKGROUND The study of determinants of asthma is a subject of much interest currently , especially the pharmacogenetic aspects of asthma management . Genetic polymorphisms affecting amino-acids at positions 16 and 27 within the beta(2)-adrenoceptor ( beta(2)AR ) gene have been implicated in asthma phenotypes and influence the variability observed in response to bronchodilator agents used in the treatment of asthma .
Whether these polymorphisms alter the bronchoprotection response to beta(2)-agonist treatment in the Spanish asthmatic population is unknown . The aim of this study was to investigate whether genetic polymorphisms within the beta(2)AR gene modulate the clinical outcomes of the individual response to beta(2)-agonist therapy and the development of desensitization in Spanish asthmatic patients . METHODS In a prospective case-control study , 80 asthmatic patients were included . Based on standard criteria , patients were classified into two groups : patients with tachyphylaxis and good responders to beta(2)-agonist therapy . DNA samples were genotyped for the Arg(16)Gly and Glu(27)Gln alleles within the beta(2)AR gene , as well as 64 control samples from blood donors . RESULTS The Arg(16 ) allele was slightly more frequent within the group with tachyphylaxis ( P=0.039 ) , whereas Gly(16 ) allele carriers were overrepresented within the group of good responders ( 59.7 % , P=0.028 ) . On the other hand , the allele frequency of Gln(27 ) and the proportion of Gln(27 ) carriers were higher within the group with tachyphylaxis ( P=0.010 and 0.049 , respectively ) and Glu(27 ) allele carriers were overrepresented within the group of good responders ( P=0.026 ) . The Arg(16 ) and Gln(27 ) alleles were in strong linkage disequilibrium across this locus , resulting in the occurrence of a disease haplotype . CONCLUSIONS The predisposition to develop tachyphylaxis in our population seems to be linked to the Arg(16 ) and Gln(27 ) alleles and to the Arg(16)/Gln(27 ) risk haplotype ( a positive association between the presence of the Arg(16 ) and Gln(27 ) alleles and tachyphylaxis ) . The Arg(16 ) allele is perhaps overrepresented due to the strong linkage disequilibrium between both polymorphisms .
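Case-control allele comparisons like the one above reduce to an odds ratio on a 2x2 table ( carriers vs non-carriers , cases vs controls ) . A minimal sketch ; the counts below are hypothetical and not the study's data :

```python
def odds_ratio(case_exposed, case_unexposed, control_exposed, control_unexposed):
    """Odds ratio from a 2x2 case-control table.

    'Exposed' here means carrying the risk allele; cases are, e.g.,
    patients with tachyphylaxis and controls the good responders.
    """
    case_odds = case_exposed / case_unexposed
    control_odds = control_exposed / control_unexposed
    return case_odds / control_odds

# Hypothetical counts: 30 of 40 tachyphylaxis patients carry the allele,
# 25 of 60 good responders carry it.
or_val = odds_ratio(30, 10, 25, 35)  # (30/10) / (25/35) = 4.2
```

An OR above 1 indicates the allele is enriched in cases ; confidence intervals ( e.g. via the log-OR standard error ) are needed before calling the association significant .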
The presence of the Glu(27 ) allele seems to be a protective factor against tachyphylaxis in this cohort study Inhaled beta-adrenergic agonists are the most commonly used medications for the treatment of asthma although there is evidence that regular use may produce adverse effects in some patients . Polymorphisms of the beta(2)-adrenergic receptor ( beta(2)-AR ) can affect regulation of the receptor . Smaller studies examining the effects of such polymorphisms on the response to beta-agonist therapy have produced inconsistent results . We examined whether polymorphisms at codon 16 ( beta(2)-AR-16 ) and codon 27 ( beta(2)-AR-27 ) of the beta(2)-AR might affect the response to regular versus as-needed use of albuterol by genotyping the 190 asthmatics who had participated in a trial examining the effects of regular versus as-needed albuterol use . During the 16-wk treatment period there was a small decline in morning peak expiratory flow in patients homozygous for arginine at beta(2)-AR-16 ( Arg/Arg ) who used albuterol regularly . This effect was magnified during a 4-wk run-out period , during which all patients returned to using as-needed albuterol , so that by the end of the study Arg/Arg patients who had regularly used albuterol had a morning peak expiratory flow 30.5 +/- 12.1 L/min lower ( p = 0.012 ) than Arg/Arg patients who had used albuterol on an as-needed basis . There was no decline in peak flow with regular use of albuterol in patients who were homozygous for glycine at beta(2)-AR-16 . Evening peak expiratory flow also declined in the Arg/Arg patients who used albuterol regularly but not in those who used albuterol on an as-needed basis . No significant differences in outcomes between regular and as-needed treatment were associated with polymorphisms at position 27 of the beta(2)-AR . No other differences in asthma outcomes that we investigated occurred in relation to these beta(2)-AR polymorphisms .
Polymorphisms of the beta(2)-AR may influence airway responses to regular inhaled beta-agonist treatment The prevalence of asthma in children has doubled over the past 25 years.1 Two common polymorphisms exist in the β adrenoceptor at amino acids 16 ( glycine for arginine ) and 27 ( glutamic acid for glutamine ) . Both are functionally relevant in cultured cells , with the glycine 16 form of the receptor showing enhanced downregulation and the glutamic acid 27 form showing attenuated downregulation after exposure to β agonists.2 The glutamine 27 polymorphism is associated with raised IgE concentrations in families with a history of asthma , and with increased reactivity of the airways in people with asthma . 3 4 We measured the prevalence of these polymorphisms in a random population of children to identify their importance in the expression of reported asthma . We approached children between the ages of 5 and 15 years ( mean 10.5 years ) and an accompanying parent who were attending the accident and emergency department To examine the roles of genetic polymorphism of the β2-adrenergic receptor ( β2AR ) in the relationship between eosinophil ( EOS ) counts and eosinophil cationic protein ( ECP ) counts and lung function , we recruited a random sample from the 1996 nationwide survey of asthma prevalence in middle school children . A total of 149 subjects—42 asthmatic children , 38 asthmatics in remission ( no reported attack for more than 12 months ) , and 69 nonasthmatics—completed a physical evaluation , pulmonary function test , and determination of EOS , ECP , and β2AR genotypes at amino acids 16 and 27 . Asthmatic children had higher EOS and ECP than did nonasthmatics . No association was found between asthma and β2AR genotypes . Lung function was significantly and inversely correlated with EOS but not with ECP in asthmatic children . By genotype , an inverse correlation between lung function and EOS was found in asthmatic children with Arg16Arg or Gln27Glu .
A nonsignificant but similar inverse correlation was found in asthmatic children with Arg16Gly or Gln27Gln . However , a nonsignificant but positive correlation was found in asthmatic children with Gly16Gly . In conclusion , we suggest that EOS is a better clinical indicator of airway inflammation than ECP when children are not having an asthma attack . The association between an increase of EOS and lower lung function can be differentiated by β2AR genotypes at amino acid 16 To determine the association between the β2‐adrenoceptor polymorphisms at amino acids 16 and 27 and markers of allergic disease and asthma per se in a random adult population , and to determine the degree of linkage disequilibrium existing between polymorphisms at amino acid positions 16 , 27 , 164 and nucleic acid residue 523 Abstract Introduction : Genetic mutations in the β2 receptor could alter its functioning and the response to β2 agonists . The study was done to find out the effect of two commonly occurring polymorphisms , Arg16Gly and Gln27Glu , on the causation of asthma and on the response to nebulized salbutamol in South Indian asthma subjects . Methods : After baseline measurements of Forced Expiratory Volume in 1st second ( FEV1 ) , Forced Vital Capacity ( FVC ) and Peak Expiratory Flow Rate ( PEFR ) , five mg of nebulized salbutamol was administered and spirometry was repeated . The increase in these parameters was calculated and patients were included for genotyping if the percentage increase in FEV1 was ≥12 % . The frequencies of these polymorphisms in patients were compared with those of healthy volunteers . Results : 112 patients and 127 healthy volunteers were genotyped . The frequencies of the polymorphisms were found to be similar to previously published Dravidian population frequencies . The frequencies of genotypes in asthmatics were similar to healthy volunteers . The increase in FEV1 , FVC and PEFR was similar across various genotypes and haplotypes in both the polymorphisms .
The GG-CG haplotype was associated with 3.1 times increased occurrence of asthma ( p value = 0.02 ) . The G allele of the Arg16Gly polymorphism was associated with lower baseline FEV1 , FVC and PEFR values , but these were not statistically significant . Conclusion : The Arg16Gly and Gln27Glu polymorphisms do not determine the occurrence of asthma individually , but the GG-CG haplotype is associated with an increased risk of asthma . There is no effect of the genotypes on the response to nebulized salbutamol
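The inclusion rule in the last abstract ( a ≥12 % rise in FEV1 after nebulized salbutamol ) is a simple percentage computation . A minimal sketch , assuming FEV1 is measured in litres ; the function names are illustrative :

```python
def percent_increase(pre, post):
    """Percentage change from pre- to post-bronchodilator measurement."""
    return (post - pre) / pre * 100.0

def bronchodilator_responsive(pre_fev1_l, post_fev1_l, threshold_pct=12.0):
    """Inclusion rule described in the study: >=12% rise in FEV1 after salbutamol."""
    return percent_increase(pre_fev1_l, post_fev1_l) >= threshold_pct

# FEV1 rising from 2.0 L to 2.3 L is a 15% increase, so the patient
# would be genotyped; a rise to 2.2 L (10%) would not qualify.
```

Note that percentage change is always taken relative to the pre-bronchodilator baseline , so the same absolute rise counts for more in a patient with lower starting FEV1 .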